At first, the battle between Apple and the FBI seemed straightforward. The FBI wanted the phone used by terrorist Syed Rizwan Farook, who with his wife killed 14 people in San Bernardino last December, opened in order to retrieve forensic data. Law enforcement agencies often find themselves with a piece of technology, iPhone or otherwise, that they need assistance accessing. For many of us, including criminals and terrorists, life is conducted via one's iPhone. It is reasonable to imagine that the iPhone would hold data of interest to investigators: contacts, texts, calls, notes, GPS logs, and the like.
What was the big deal?
Then more details emerged. We learned that the FBI was actually asking Apple to write new software that would break the phone's encryption. The newest iPhones are encrypted in a way that older models, such as Farook's 5C, were not.
Apple argues that its raison d'être as a business, the essential value of its brand, is that its phones are not hackable. It states that when the federal government asks Apple to write such code, it is compelling the company to act counter to its responsibility to secure its customers' data.
Craig Federighi, Apple's Senior Vice President of Software Engineering, wrote in last Sunday's Washington Post that smartphones are central to our lives and a very tempting criminal target, which, he argues, is why encryption is built into the iPhone itself. He states that "Doing anything to hamper that mission would be a serious mistake," and that is "why it's so disappointing that the FBI, Justice Department and others in law enforcement are pressing us to turn back the clock to a less-secure time and less-secure technologies."
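To see why built-in encryption resists brute-force attacks, it helps to look at the general technique behind it. The sketch below is not Apple's actual scheme (which entangles the passcode with a hardware-bound key inside the device); it is a generic, minimal illustration using the standard PBKDF2 key-derivation function, with a made-up salt standing in for a device-unique identifier. The point is that the encryption key is never stored: it is re-derived from the passcode, and the deliberately slow derivation makes guessing passcodes expensive.

```python
import hashlib

def derive_key(passcode: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Stretch a short passcode into a 256-bit encryption key.

    The high iteration count makes each guess costly, so even a
    4-digit passcode cannot be brute-forced cheaply off-device.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations, dklen=32)

# Hypothetical device-unique salt; on a real phone this value is
# bound to the hardware, so the key cannot be derived anywhere else.
salt = b"device-unique-salt"

key = derive_key("1234", salt)    # correct passcode -> correct key
wrong = derive_key("0000", salt)  # wrong passcode -> completely different key

assert key != wrong
assert len(key) == 32
```

Because the key exists only when the right passcode is entered on the right device, there is no stored secret for Apple to hand over, which is why the FBI's request amounted to asking for new software rather than an existing master key.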
I tried coming up with a metaphor to better understand Apple's position. Let's say that my company is in the business of constructing safe rooms. Our customers rely on us to build bulletproof solutions that are impenetrable, truly secure, and entirely under their control. Now local city officials demand that I create a way to get into these safe rooms, on the grounds that a government agency might one day need access. Under this scenario, as a business owner and as a client, I might well be a bit uncomfortable.
Apple, along with digital rights groups and security experts, has argued that an encryption-breaking tool would almost certainly get loose and endanger other users of Apple devices. (Why would the tool get loose? I'm not sure that's a given.)
And given the onslaught of IT crime of every size and from every corner, surely everyone agrees that we need to be constantly vigilant, moving toward more security rather than less, and that encryption is a good thing. Encryption per se does not seem to be the real point of disagreement here.
Some ask why the FBI isn't able to break open the phone itself, or at least ask the NSA to step in and help. The NSA's estimated annual budget is in the billions, it employs tens of thousands of technology experts, and its middle name is Hacking. Never mind that what we know about its budget and size derives from the NSA's own security lapse: Snowden. In any event, politics intervenes in that scenario as well. The NSA wouldn't want to get involved on the chance that its hacking methods would be made public, perhaps in a court venue. And wouldn't the government have wanted the same level of encryption security that Apple provides to have been in place when its systems at the OPM were hacked to the tune of over 21 million records in 2014 and 2015?
Some security experts believe that the FBI is in fact capable of hacking the phone without Apple’s aid but that the Feds would rather have Apple do it, creating legal precedent that would give law enforcement agents easier access to data. FBI Director James Comey insisted at a congressional hearing that “we have engaged all parts of the U.S. government to see, does anybody have a way, short of asking Apple, to do it, with a 5C running iOS9, and we do not.”
Another analogy for understanding this spat is the relationship between a professional, such as a doctor or lawyer, and a client. That relationship, and the information passed between the parties, is privileged. A lawyer cannot be compelled to share information that would incriminate a client; prosecutors are obliged to find other ways to get the information, a workaround that does not conflict with individual rights. The role of the FBI is to fight crime. Is a private corporation like Apple obliged to take on that role when asked?
The RSA Conference is one of the largest information security conferences in the world. It began in 1991 as a forum for cryptographers to gather and share advances in IT security. At last week's RSA 2016 conference, two panelists shared views that might surprise some: former DHS Secretary Michael Chertoff said that "If we ask private sector to be in control of security, then we have to allow them to have tools to carry out that mission." Former NSA Director Mike McConnell remarked that "When you understand that level of extraction of intellectual property, it's logical that ubiquitous encryption is something the nation needs to have."
The RSA 2016 website explains the conference's mission:
… While our instantly-connected world offers tremendous benefits, it also has a downside: the proliferation of malicious attackers who are constantly developing sophisticated methods to steal our data and disrupt our lives. 25 years ago, RSA Conference was created so professionals could reach out to each other and collectively address growing cyber security threats. Today, RSA Conference promotes connections not only among the information security community, but also between IT and other parts of the enterprise, private and public sectors …
So, at the end of the day, Apple is saying that encryption helps safeguard personal data from getting into the wrong hands, while the FBI and law enforcement say criminals benefit from encryption because it shields them. This year's RSA Conference theme, Connect to Protect, is cogent here. One would think the best-case scenario, and the path least likely to cause harm, is having both sides connect and work together, Apple's red team of ethical hackers and a special team of FBI techies, to find a way to protect both the nation and the individual.