- Apple’s decision not to unlock or create a backdoor into the iPhones used by a gunman in a Florida shooting last month puts the tech giant at odds with the United States government yet again.
- Security experts agree, however, that circumventing the iPhone’s security poses a significant risk to iPhone users since it would provide a means to obtain private data that even Apple can’t presently access.
- There’s a risk that such a tool could fall into the wrong hands, some experts warn.
Attorney General William Barr recently called on Apple to help unlock the iPhones used by a gunman in Pensacola, Florida last month – a situation that once again requires the tech giant to balance protecting consumer privacy with its legal obligation to assist in investigating a shooting that’s resulted in the loss of American lives.
But security experts agree that providing access to the shooter’s iPhone could jeopardize the security of the millions of iPhones in use around the world.
“In essence, you’re trying to make a weapon that can only be used on a single target,” Jacob Doiron, an information systems lecturer at San Diego State University, said to Business Insider. “But that’s not the nature of weapons, or exploits. They are applicable to any device that has that profile or configuration.”
On Monday, Barr said that Apple had not provided any “substantive assistance” in getting access to two iPhones belonging to the shooter, Mohammad Alshamrani, who killed three people at a naval airbase last month. But Apple has since refuted that characterization, saying that it had provided iCloud backups, information, and other data from Alshamrani’s account in cooperating with the investigation. Now, Apple is reportedly gearing up for a legal battle with the Department of Justice to defend its position, according to The New York Times.
“We have always maintained there is no such thing as a backdoor just for the good guys,” Apple said in a comment to Business Insider. “Backdoors can also be exploited by those who threaten our national security and the data security of our customers.”
Apple took a similar position in 2016 when it was caught in a stand-off with the Federal Bureau of Investigation over whether it should unlock an iPhone linked to a shooting in San Bernardino, California. Apple refused to unlock the iPhone, and the FBI ultimately ended up working with a private company to gain access to the device.
The crux of the issue when it comes to unlocking an iPhone or bypassing its encryption, according to privacy experts, is that once Apple creates a backdoor, there’s a risk that it can be used in unpredictable and in some cases harmful ways.
“I would say the chances of it falling into the wrong hands are 100%,” said Mark Nunnikhoven, vice president of cloud research for cybersecurity firm Trend Micro.
There’s also the question of why Apple couldn’t just create the tool for the purposes of the investigation and then push an update to iPhones that would render it obsolete. For that to work, the backdoor would have to be tied to the software only, not the iPhone’s hardware, says Doiron. “Sometimes these vulnerabilities take place at the hardware level,” he said. “That’s not something that could be fixed via software.”
“We’re on your side”
The broader issue, however, may be that creating such a tool would put private, encrypted data from iPhone users in the hands of Apple and its employees – a capability the company doesn’t want in the first place. Such a move would be in stark opposition to Apple’s stance on consumer privacy.
“You are not our product,” Apple CEO Tim Cook said in an interview with ABC News last year. “Our products are iPhones and iPads. We treasure your data. We want to help you keep it private and keep it secure. We’re on your side.”
Theoretically, if Apple were to create some type of tool or key that would provide backdoor access to encrypted iPhone data, employees from Apple would have access to that information as well since they would likely be assisting in the investigation. What’s to prevent an Apple worker from going rogue and possibly leaking iPhone user data, or using the tool for nefarious purposes?
Nunnikhoven pointed to EternalBlue as an example of how a tool built for specific purposes could fall into the wrong hands. EternalBlue was a National Security Agency hacking tool that leaked to the public in 2017 and was linked to the WannaCry ransomware attack that infected computers all over the world that same year.
Creating the tool in general would also require a significant effort on Apple’s part. It’s not simply about cracking the passcode of the device, but would likely require that a dedicated team at Apple create a piece of software capable of accessing the data stored on the device, says Nunnikhoven. The government, in other words, is asking Apple to enable something that isn’t even possible on iPhones today.
Unlocking these iPhones for the Pensacola investigation would also likely set a precedent for law enforcement agencies to request similar treatment for future cases as well, says Matt Wilson, chief information security advisor at BTB Security.
“It’s just more evidence to prove this isn’t just [cybersecurity experts] saying, ‘I don’t want to think about it,'” said Wilson. “It’s [experts] saying we’ve thought about it very long and very hard, and we don’t see a viable way that addresses all of these issues.”