With contributing author Rachel M. Riley
There’s a conversation going on right now that sounds like this:
Government: “Give us data. We want to make you more safe!”
Company: “No, thanks. We’d rather put our customers at risk.”
Wait. That doesn’t sound right. We must have misheard it. It must be more like this:
Government: “Give us data. We want to make you more safe!”
Company: “Sure. Take it all. Here you go. Keep us safe.”
Could we have misheard that, too? Or – gasp! – could those two separate conversations be occurring simultaneously? And how is that possible?
Quick answer: Yes, both conversations are happening at once. That is the tension we are seeing right now – the classic standoff between privacy and security.
Economists like to call it a zero-sum game, where one side’s gain is exactly the other side’s loss. But economists are peculiar people and tend to want to assume things that aren’t necessarily there, so we’ll chart our own course. (Example: Three economists go hunting and come across a deer. The first economist fires and misses by one foot to the left. The second economist fires and misses by a foot to the right. The third economist stands up and shouts, “We hit it!”)
We instead like to think of this tension as a pendulum, with security on one side and privacy on the other. When the pendulum swings toward security, then more and more privacy interests are sacrificed for security. And vice versa.
In mid-September 2014, Apple issued an open letter about its commitment to privacy. One of the company’s key points was that the encryption of its new mobile operating system, iOS 8, prevents anyone but the end users from accessing the content of their iPhones or iPads that are running the system. The takeaway point? Well, let’s just ask Apple itself: “On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes and reminders is placed under the protection of your passcode. Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.”
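Apple’s letter doesn’t spell out the machinery, but the core idea of passcode-based encryption fits in a few lines. Here is a minimal Python sketch – the function names and parameters are ours, not Apple’s, and Apple’s real design additionally entangles the passcode with a key burned into each device’s hardware – showing why a company that never stores your passcode cannot produce your data:

```python
import base64
import hashlib
import os

from cryptography.fernet import Fernet  # third-party: pip install cryptography

def derive_key(passcode: str, salt: bytes) -> bytes:
    # Stretch a short passcode into a 256-bit key. The high iteration
    # count makes each brute-force guess expensive.
    raw = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)
    return base64.urlsafe_b64encode(raw)  # Fernet expects a base64 key

salt = os.urandom(16)  # stored on the device; random, but not secret
locked = Fernet(derive_key("0000", salt)).encrypt(
    b"photos, messages, contacts, call history ..."
)

# Only the passcode re-derives the key. Anyone holding just the
# ciphertext and the salt, Apple included, is out of luck.
print(Fernet(derive_key("0000", salt)).decrypt(locked))
```

No passcode, no key; no key, no data. That, in a nutshell, is why Apple can tell a court that extraction is not technically feasible.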
Google’s newest Android operating system will offer encryption as well. Google previously offered it as an option but will now apply it by default. (Q: Why did the Stormtrooper choose an iPhone? A: Because he couldn’t find the ’Droid he was looking for.)
There’s a lot to, well, decrypt here. First, let’s get a little historical perspective. Can you imagine the reaction if, the day after the 9/11 attacks, a company had announced that it was developing a system so secure that law enforcement could not access it? Mass hysteria, right? Apple’s stock price plummeting? Because that was when the pendulum took its hardest swing toward the security side, resulting in the passage of the Patriot Act just six weeks after the attacks. No company would have wanted to be perceived as shielding potentially vital information from the government.
Contrast that time with the present, in a world where NSA spying accusations, Edward Snowden, nude celebrity photographs, the Target credit card breach and the Sony information leaks have shifted the focus away from the possibility of a terrorist attack on U.S. soil and onto the basic privacy protections that citizens expect for their personal information. According to polls taken in July 2013, after the Snowden revelations, Americans were more worried about the infringement of civil liberties than about terrorism – for the first time since the 9/11 attacks.
But the fundamental nature of a pendulum is to swing back … which is exactly what happened after the recent Paris terrorist attacks and North Korean hacking activity. Once again, a Washington Post-ABC poll showed that 63 percent of Americans preferred protection to privacy. When the next inevitable data breach of a major retailer compromises thousands of pieces of consumer data, can you guess which way the pendulum will swing?
You may be wondering what the urgency is around this issue right now. After all, this tension has existed for years, right? Well, this time there is a stark distinction: for the first time, a major piece of technology has the potential to prevent law enforcement from acquiring important – and private – information.
Think about it. Before, the question revolved around whether the government could establish sufficient proof to access information. Now, though, companies are proposing to block law enforcement’s access regardless of whether it has proof. What’s more, Apple is able to use the shielding of information from the government as a marketing ploy – and Google immediately matched it.
But while the broad-based approach of the NSA has fueled public sentiment toward privacy, the real impact of encryption will be felt by the police officers and federal agents who investigate non-terrorism cases. They have historically relied on constitutionally permissible searches based on a warrant, obtained by making a showing of probable cause to a judge.
How would this impact investigations? There are plenty of real-life examples, but here’s a hypothetical that captures them: a Starbucks employee calls the police after two customers independently complain that a man in the coffee shop is viewing what is clearly child pornography on an iPad. When detectives arrive and ask to see the iPad, which runs iOS 8, the device is turned on but requires a passcode for access. The man declines to provide it and asks for an attorney.
Now what? Now that Apple has encrypted its devices, what can the detectives do? And what if the man is part of a child pornography ring – aren’t linkages that would otherwise appear on the iPad now lost? Apple’s take – and Google’s take – is that the privacy concern is paramount.
What’s your take?
This kind of situation is why so many in law enforcement – including the Attorney General, the Director of the FBI and the Manhattan District Attorney – are adamantly against what Apple and Google have done. It’s why people like John J. Escalante, Chief of Detectives for the Chicago Police Department, say, “Apple will become the phone of choice for the pedophile. The average pedophile at this point is probably thinking, I’ve got to get an Apple phone.”
Because of this type of issue and the government’s interest in preventing terrorism, law enforcement and the White House have called on companies to build a back door into their encryption walls for law enforcement access. In essence, the government’s position is that no means of communication should exist that it cannot access in some fashion. So: no more iOS 8 encryption, no more Android encryption, and perhaps heavy caution for apps like Snapchat, which delete user messages immediately without saving them. To protect security, in other words, the government would answer technological advances by restricting innovation.
Is there a happy medium? Almost everyone agrees that building a law enforcement back door into iOS 8 or Android is a bad idea, one that will lend itself to exploitation by determined hackers. After all, hackers are outstanding at exploiting vulnerabilities to access places where they should not be – just imagine if a vulnerability were built right into the system. The Washington Post’s Editorial Board, which predictably calls for a balance between security and privacy interests, offers that, “with all their wizardry, perhaps Apple and Google could invent a kind of secure golden key they would retain and use only when a court has approved a search warrant.”
A golden key steeped in wizardry? And surely riding on the back of a unicorn.
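Unicorns aside, the golden key has a mundane name in cryptography: key escrow. A toy sketch – ours, not any vendor’s actual design – shows both how it would work and why it frightens security engineers. The escrow key becomes a single point of failure for every device at once:

```python
from cryptography.fernet import Fernet  # third-party: pip install cryptography

# Each device encrypts its data under its own content key ...
content_key = Fernet.generate_key()
device_data = Fernet(content_key).encrypt(b"photos, messages, ...")

# ... and wraps that content key twice: once for the user,
# once under the vendor-held "golden key".
user_key = Fernet.generate_key()
golden_key = Fernet.generate_key()  # one key to open every device

wrapped_for_user = Fernet(user_key).encrypt(content_key)
wrapped_for_warrant = Fernet(golden_key).encrypt(content_key)

# The catch: anyone who obtains the golden key (hacker, insider,
# foreign intelligence) unwraps EVERY device, warrant or not.
stolen_content_key = Fernet(golden_key).decrypt(wrapped_for_warrant)
print(Fernet(stolen_content_key).decrypt(device_data))
```

Steal that one key and every phone opens, warrant or no warrant. The vulnerability is no longer a bug to be found; it’s a feature that ships in the box.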
There really is no good answer here. Security and privacy do not coexist well. The real question is whether existing laws can ensure that our courts – the neutral arbiters – are equipped to make these determinations on a case-by-case basis.
Because unfortunately, on either side of the pendulum, there is a deep, dark and perilous pit.