Last week, the fight between Apple and the United States government escalated to a nasty new level.
But US President Obama’s offhand statement about “fetishizing our phones” provides some new insight that can show why this case evokes such heated reactions on both sides — and why the government will probably win on this issue, even if it loses this specific case.
To recap, the FBI wants to make Apple help it unlock an iPhone used by Syed Farook, who, with his wife Tashfeen Malik, killed 14 people in December in San Bernardino. The government has defined the mass shooting as an act of terrorism.
On February 16, a District Court ordered Apple to comply, and since then Apple and the Department of Justice have exchanged shots back and forth with increasingly aggressive language, culminating in Thursday’s statement from Apple calling the government’s latest brief “deeply offensive” and “desperate.”
There’s a lot of heated rhetoric on both sides of this argument.
What techies think
Apple and a lot of the tech community argue that asking Apple to help the FBI overcome the iPhone’s built-in security mechanisms is the first step on a slippery slope that would eventually weaken the security of our tech products to the point where they’re easily hackable by bad guys.
In this particular case, the FBI is not asking Apple to defeat the iPhone’s encryption — the actual technology that is used to lock data on the phone. Rather, the government wants Apple to create a special new version of the iPhone’s software that would allow the FBI to enter an unlimited number of passcode guesses, very quickly, until it finds the one that unlocks the phone. Apple has dubbed this custom version of its software “GovtOS.”
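The arithmetic behind that request is worth spelling out. Once the software-imposed retry limits and escalating delays are gone, the only thing slowing a brute-force attack is the hardware’s key-derivation time — roughly 80 milliseconds per guess, per Apple’s own iOS security documentation. A minimal sketch, where the per-guess figure and passcode lengths are illustrative assumptions rather than details from the case:

```python
# Illustrative sketch: why stripping out iOS retry limits makes passcode
# brute-forcing practical. Assumes ~80 ms per guess, the minimum attempt
# time Apple's iOS security guide attributes to hardware key derivation.

def brute_force_hours(digits: int, seconds_per_guess: float = 0.08) -> float:
    """Worst-case hours to try every numeric passcode of the given length."""
    return (10 ** digits) * seconds_per_guess / 3600

# With only the hardware floor left in place:
print(f"4-digit passcode: {brute_force_hours(4) * 60:.0f} minutes, worst case")
print(f"6-digit passcode: {brute_force_hours(6):.0f} hours, worst case")
```

At that rate a 4-digit passcode falls in minutes and even a 6-digit one in about a day — which is why the retry limits, not the encryption itself, are what the court order targets.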
Regardless, the slippery slope argument goes like this: if the government can force Apple to help it bypass security in this case, it can later force Apple to do other things like build back doors into the iPhone’s encryption, so that the government can more easily get information from all iPhones as long as it has the proper court orders.
Once the tools to break security are out there, it’s almost inevitable that they will fall into the wrong hands. That means that bad guys will be able to break into iPhones and steal data just as easily as the government can.
In other words, this isn’t just a civil liberties argument. It’s a technology argument, and if the government wins, it will paradoxically make it harder to protect against anyone who’d try to get private information from our phones.
“Fetishizing our phones”
The government and those who agree with it argue that this is like any other court order — like a search warrant, or discovery in a lawsuit — that forces a party to give up private information to assist with a legal proceeding.
President Obama on Friday spoke a little bit about his point of view on encryption without specifically mentioning Apple by name:
My conclusion so far is that you cannot take an absolutist view on this. So if your argument is strong encryption no matter what and we can and should in fact create black boxes, that that I think does not strike the kind of balance that we have lived with for 200-300 years and it’s fetishizing our phones above every other value.
Obama’s statement is actually very useful in thinking about this issue.
Smartphones are different from other kinds of computing devices. They’re deeply personal. We carry them with us everywhere and check them 24 hours a day. They contain a microcosm of our entire lives — our contacts, the websites we look at, all the different ways we communicate electronically (voice, text, email, work apps, all our social networks, and on and on), and how we spend our leisure time.
Apple understands this, and has built ever-tighter security into the iPhone as a selling point: it’s OK to trust your entire life to this device, because it’s secure. (This is important not only to individuals but to corporate IT departments, which are naturally terrified of all the work-related information that passes through our phones.)
But the courts have historically given the government — and private actors in legal cases — all kinds of power to violate our privacy in the name of preventing and prosecuting crime.
With the proper legal orders, the government may tap our phones, pick our locks, put hidden microphones in our homes and workplaces, and scour our computer records.
What’s so special about an iPhone?
On a technical level, the techies are absolutely correct. Once you’ve broken security for one actor by building a back door, that security becomes a lot less valuable. It will still keep out the masses, but any technically sophisticated party will eventually be able to walk right through the back door, either with stolen tools or by developing their own.
In this particular case, Apple might win. This is a lot more than the government getting an order to do something it already knows how to do, like tap a phone line or pick a lock. This is more like the government telling the company that designed an uncrackable safe that it must come up with a way to crack it.
Personally, I distrust the government’s assurances that this kind of technology will only be used in specific types of cases. As we’ve seen most recently in the case of the NSA’s wiretapping data being used by government agencies in cases that go far beyond terrorism, the slippery slope argument has a lot of historical precedent. Open the door a crack, and it will eventually be thrown wide open.
Nonetheless, it’s hard to argue that the iPhone deserves different legal treatment from any other kind of device. If the government wants a way to get data from it, the government will find a way.
This brings us back to the most important thing to remember about computer security, which older folks learned when computers first became common but which people who grew up online may not have absorbed yet: Nothing you do or say on your computer is private by default. Assume it’s public. If you want privacy, you’re responsible for providing it yourself.