Hillary Clinton was forced to wade into the ongoing debate over encryption at Saturday’s Democratic debate.
The former Secretary of State and presidential hopeful gave a muddled answer when asked whether she would move to curtail the technology if she became president — arguing that while “maybe the back door is the wrong door,” it nonetheless “doesn’t do anybody any good if terrorists can move toward encrypted communication that no law enforcement agency can break into before or after.”
The problem: these two statements arguably contradict each other.
Some quick background
Encryption is in the news at the moment, but it’s not new by any means. It’s a way of protecting communications or data so that they are incomprehensible without the correct passcode or key, and various forms of encryption software have been around for decades.
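The core idea — data that is meaningless without the right key — can be illustrated with a toy one-time-pad cipher. This is a deliberately minimal Python sketch for illustration only, not the scheme used by Apple or any real product:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR each byte of the input with the matching key byte.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at the safe house"
key = secrets.token_bytes(len(message))   # random secret key

ciphertext = xor_bytes(message, key)      # without the key, this looks like noise
recovered = xor_bytes(ciphertext, key)    # with the key, the message comes back

assert recovered == message
```

Real systems like the iPhone’s disk encryption use far stronger algorithms (such as AES), but the principle is the same: no key, no data.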
However, there has been a move by consumer tech companies to incorporate strong encryption into their products in recent years, following revelations of US government surveillance by NSA whistleblower Edward Snowden.
Apple is one of the loudest advocates for the tech, and the iPhone now has disk encryption enabled by default, preventing it from being accessed by anyone without the passcode — and that includes Apple and law enforcement. The same goes for iMessage: Apple says it cannot intercept and decode messages sent on the service, even when handed a warrant by a court.
This is — predictably — infuriating law enforcement, as data they once had access to is “going dark.” FBI director James Comey is one particularly prominent critic of encryption, claiming that it aids terrorists (Comey’s remarks were referenced in the question Clinton was asked at the debate). There are calls for tech companies to incorporate “back doors” into their encrypted products to avoid their misuse — letting the authorities access their contents when required.
So what’s the issue?
The problem is that building a back door into encryption inherently makes the entire system weaker. “You can’t have a back door that’s only for the good guys,” Apple CEO Tim Cook (among many others) has said. Once it’s there, it’s open to abuse by anyone who can find it, defeating the whole point of encryption: strong security.
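To see why, extend the toy cipher above with a hypothetical “key escrow” back door: every message key gets a second copy wrapped under a master key held for the authorities. This is an assumption-laden sketch, not any real proposal’s design — but it shows that whoever obtains the master key, good guy or not, can decrypt everything:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR each byte of the input with the matching key byte.
    return bytes(d ^ k for d, k in zip(data, key))

# The "back door": a single master key escrowed for law enforcement.
MASTER_KEY = secrets.token_bytes(64)

def encrypt_with_backdoor(message: bytes):
    user_key = secrets.token_bytes(len(message))
    ciphertext = xor_bytes(message, user_key)
    # Escrow a copy of the user key, wrapped under the master key.
    escrowed_key = xor_bytes(user_key, MASTER_KEY[:len(user_key)])
    return ciphertext, escrowed_key

msg = b"private message"
ciphertext, escrow = encrypt_with_backdoor(msg)

# Anyone who steals MASTER_KEY -- not just "the good guys" --
# can unwrap the escrowed key and read every message ever sent:
stolen_key = xor_bytes(escrow, MASTER_KEY[:len(escrow)])
assert xor_bytes(ciphertext, stolen_key) == msg
```

The security of every user now rests on one secret never leaking — a single point of failure the original system didn’t have.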
And encryption isn’t just used for messages — it’s also essential for online commerce, for sensitive commercial communications, and for the transfer of vital military data. You can’t mandate backdoors without putting all this in jeopardy, and placing limits on the use of encryption for certain types of product (like messaging apps) wouldn’t work because it’s trivial for any would-be criminal to switch to an alternative app built outside of that government’s jurisdiction — avoiding the back door.
So even if you banned encryption, terrorists would still use it.
(Plus, it sets a worrying political precedent. If the US government demands Apple introduce back doors to catch terrorists in the US, what’s to stop China demanding Apple introduce back doors to catch political dissidents?)
Hillary’s encryption non-answer
At the debate, Hillary Clinton — the clear frontrunner for the Democratic presidential nomination — was asked whether she would “force [Apple CEO Tim Cook] to give law enforcement a key to encrypted technology by making it law?”
She responded: “I would not want to go to that point.” So far, so good. She also claimed that “maybe the back door is the wrong door, and I understand what Apple and others are saying about that.” It’s positive rhetoric for technologists and privacy activists.
But, she also argued, “it doesn’t do anybody any good if terrorists can move toward encrypted communication that no law enforcement agency can break into before or after. There must be some way.”
She went on: “I just think there’s got to be a way, and I would hope that our tech companies would work with government to figure that out. Otherwise, law enforcement is blind — blind before, blind during, and, unfortunately, in many instances, blind after.”
There is one way that tech companies can work with governments to solve this problem — back doors. But, for the reasons outlined above, those in tech almost unanimously reject them. Encryption is based on mathematical principles: Trying to prevent the use of encryption products that governments cannot access is akin to trying to ban maths, the argument goes.
Clinton is clearly trying to be inclusive in her answer — calling for a “Manhattan-like project” to “bring the government and the tech communities together to see they’re not adversaries, they have got to be partners.” She’s appeasing both sides, trying to appear sympathetic to law enforcement while recognising techies’ genuine concerns.
But her rejection of back doors while still calling for law enforcement access just doesn’t work. Clinton is trying to have her cake and eat it, too.
Here’s Clinton’s full answer (transcript via TIME):
QUESTION: Secretary Clinton, I want to talk about a new terrorist tool used in the Paris attacks, encryption. FBI Director James Comey says terrorists can hold secret communications which law enforcement cannot get to, even with a court order.
You’ve talked a lot about bringing tech leaders and government officials together, but Apple CEO Tim Cook said removing encryption tools from our products altogether would only hurt law-abiding citizens who rely on us to protect their data. So would you force him to give law enforcement a key to encrypted technology by making it law?
CLINTON: I would not want to go to that point. I would hope that, given the extraordinary capacities that the tech community has and the legitimate needs and questions from law enforcement, that there could be a Manhattan-like project, something that would bring the government and the tech communities together to see they’re not adversaries, they have got to be partners.
It doesn’t do anybody any good if terrorists can move toward encrypted communication that no law enforcement agency can break into before or after. There must be some way. I don’t know enough about the technology, Martha, to be able to say what it is, but I have a lot of confidence in our tech experts.
And maybe the back door is the wrong door, and I understand what Apple and others are saying about that. But I also understand, when a law enforcement official charged with the responsibility of preventing attacks — to go back to our early questions, how do we prevent attacks — well, if we can’t know what someone is planning, we are going to have to rely on the neighbour or, you know, the member of the mosque or the teacher, somebody to see something.
I just think there’s got to be a way, and I would hope that our tech companies would work with government to figure that out. Otherwise, law enforcement is blind — blind before, blind during, and, unfortunately, in many instances, blind after.
So we always have to balance liberty and security, privacy and safety, but I know that law enforcement needs the tools to keep us safe. And that’s what I hope, there can be some understanding and cooperation to achieve.