The Cupertino company has reacted furiously.
Apple CEO Tim Cook has published a strongly worded letter calling the demand “chilling” and arguing that it “would undermine the very freedoms and liberty our government is meant to protect.”
So what’s the big deal?
This was sparked by the San Bernardino mass shooting
First, some background on the case.
FBI investigators are trying to access data on the phone of one of the San Bernardino shooters, who killed 14 people and injured 22 more in a mass shooting in California in December 2015. They’re looking to work out how the shooters were influenced by Islamist terror groups, according to The Guardian.
However, the phone’s user — Syed Farook — was killed in a subsequent shoot-out with police. The device in question, an iPhone 5c, was encrypted using Apple’s default software, meaning no one can access its data without the correct passcode, including Apple and FBI investigators.
As such, the FBI has taken Apple to court to try to get its help in unlocking the phone. It isn’t trying to get Apple to remove the encryption on the device altogether. Instead, it wants Apple to create software that bypasses the limit on the number of passcode attempts that can be entered before the device auto-wipes, which would let investigators brute-force their way in by trying every possible combination.
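To see why that retry limit matters, here is a rough sketch in Python of how small the search space for a numeric passcode is once unlimited guesses are allowed. The per-guess timing figure below is an illustrative assumption for this sketch, not an Apple specification:

```python
from itertools import product

def passcode_space(length=4, digits="0123456789"):
    """Yield every possible numeric passcode of the given length."""
    for combo in product(digits, repeat=length):
        yield "".join(combo)

# A 4-digit passcode has only 10^4 = 10,000 possibilities.
total = sum(1 for _ in passcode_space(4))
print(total)  # 10000

# Assume (hypothetically) ~80 ms of key-derivation work per guess.
# With no retry limit and no escalating delays, exhausting the
# whole space takes minutes, not years.
seconds_per_guess = 0.08
print(f"Worst case: {total * seconds_per_guess / 60:.1f} minutes")  # 13.3 minutes
```

This is exactly why the auto-wipe and attempt limits exist: they turn a trivially searchable keyspace into one an attacker gets at most ten tries against.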
On Tuesday, a US magistrate ordered Apple to provide the FBI with that assistance.
Apple has indicated it intends to appeal — for reasons we’ll get to shortly.
There’s an ongoing war over privacy and lawful access to data
This court case isn’t taking place in a vacuum. We’re in the middle of a bitter feud between tech companies and law enforcement about the rise in the use of encryption.
In the years following NSA whistleblower Edward Snowden’s revelations about the US government’s mass surveillance programs, there has been a heightened awareness of privacy issues, and moves to strengthen protections on consumer products.
Apple has been one of the strongest voices in support of this move, and all new iPhones and Apple devices are now encrypted by default.
This has, predictably, infuriated some in law enforcement, who argue that vital evidence is “going dark.” (Note: A recent Harvard study claims that rather than “going dark,” investigators have more evidence at their fingertips than ever before.)
James Comey, director of the FBI, is an advocate for backdoors into encryption products to allow law enforcement access when required, and there have also been legislative calls to mandate encryption backdoors.
Technologists and privacy advocates are strongly resisting this. There are a number of arguments against encryption backdoors: they would be liable to abuse by malicious third parties; they would be ineffective, because the criminals they are intended to catch would simply switch to uncompromised encryption tools; and they would set a dangerous precedent, inviting authoritarian regimes to demand backdoor access from tech companies so they can crack down on activists and dissidents.
Apple is angrily rejecting this “overreach by the U.S. government”
Let’s get back to the San Bernardino case. What the FBI is asking for perhaps isn’t a backdoor in the traditional sense — it’s not an extra encryption key held in escrow that would let investigators immediately decrypt the iPhone data they’re after.
But in an open letter published on Apple’s website, CEO Tim Cook argues that it amounts to a “backdoor” — and that it’s extremely “dangerous.”
Cook says what the FBI is asking for does not currently exist, and Apple would have to make it. “The FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.”
He argues that complying will make ordinary people “less safe.” In his words:
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.
We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.
The Apple CEO then describes the demand as a “dangerous precedent,” which would grant the US government “the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.”
He concludes: “While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.”
Activists are rallying in support of Apple
The Electronic Frontier Foundation (EFF), a civil liberties group, is supporting Apple. It says it worries that the FBI’s demands set a precedent, and if Apple is forced to create the code then it will be used again and again.
“For the first time, the government is requesting Apple write brand new code that eliminates key features of iPhone security — security features that protect us all,” EFF deputy executive director Kurt Opsahl writes in a blogpost. “Essentially, the government is asking Apple to create a master key so that it can open a single phone. And once that master key is created, we’re certain that our government will ask for it again and again, for other phones, and turn this power against any software or device that has the audacity to offer strong security.”
Opsahl adds: “The U.S. government wants us to trust that it won’t misuse this power. But we can all imagine the myriad ways this new authority could be abused. Even if you trust the U.S. government, once this master key is created, governments around the world will surely demand that Apple undermine the security of their citizens as well.”
This isn’t about one iPhone. If this precedent gets set it will spell digital disaster for the trustworthiness of any and every device.
— Kevin Bankston (@KevinBankston) February 17, 2016
So what happens now?
The court order compelling Apple to assist the FBI ends as follows: “To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make an application to this Court for relief within five business days of receipt of the order.”
CEO Tim Cook has made it clear his company is “opposing” the order, because “we feel we must speak up in the face of what we see as an overreach by the U.S. government.”
Now comes a (likely lengthy) legal showdown between the FBI and Apple — one that privacy activists and law enforcement alike will be watching closely.