Apple, encryption, iPhones, and the FBI plainly explained
Most Americans, and many of the world’s iPhone users, are now aware that a court order was issued on February 16 to compel Apple to assist the FBI in retrieving information from an iPhone. This was the phone recovered in the aftermath of the mass shooting in San Bernardino in December of last year. Apple objected to the FBI’s demands and very public legal maneuvering ensued.
In this article I endeavor to explain some of the key issues that this situation raises, for both privacy and security, as they impact companies, consumers, and governments.
To take the technical issues first, I’d like to clarify some terms that have been tossed around, such as encryption. The Apple iPhone can encrypt information stored on it, meaning the data are scrambled so that nobody can make sense of them without the key, like this: xmmib fmelkb. That is "Apple iPhone" encrypted with the Caesar cipher, using a key of three: each letter is shifted three places in the alphabet. You could probably figure that out if you stared at it for a while; it’s a highly predictable substitution cipher, so a is always x and so on.
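For the curious, that shift can be sketched in a few lines of Python (the `caesar` helper here is my own illustration of the classical cipher, not anything Apple uses):

```python
def caesar(text, shift):
    """Shift each letter back by `shift` places, wrapping around the alphabet."""
    out = []
    for ch in text.lower():
        if ch.isalpha():
            out.append(chr((ord(ch) - ord('a') - shift) % 26 + ord('a')))
        else:
            out.append(ch)  # leave spaces and punctuation alone
    return ''.join(out)

print(caesar("Apple iPhone", 3))   # -> xmmib fmelkb
print(caesar("xmmib fmelkb", -3))  # -> apple iphone
```

Decryption is just the same shift run in reverse, which is exactly why this cipher is so easy to break.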
Yet this illustrates a couple of important points. The FBI is not asking Apple to reveal or break the way the encryption process works on the iPhone, nor are they asking for the key (Apple does not have the key). The FBI wants Apple to remove impediments to guessing the key.
Many of the encryption systems we use on digital devices today create very complex keys using random data and very long numbers. But there is a user element as well, a passphrase or passcode that we choose ourselves. On iPhones this began as a four-digit number, of which there are 10,000 possible combinations (10^4). With the iOS 9 version of the operating system you can choose six digits, offering 1 million possible combinations (10^6). But letters are also allowed in passcodes now, vastly expanding the number of guesses it would take to crack a truly random passcode.
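The arithmetic behind those counts is simple: the number of possible passcodes is the alphabet size raised to the power of the passcode length. A quick illustration (the `keyspace` helper is hypothetical, purely for this article):

```python
def keyspace(alphabet_size, length):
    """Number of possible passcodes of a given length over a given alphabet."""
    return alphabet_size ** length

print(keyspace(10, 4))  # -> 10000    four-digit codes
print(keyspace(10, 6))  # -> 1000000  six-digit codes
```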
For security-conscious users, iOS allows up to 37 characters with the Custom Alphanumeric Code option. Drawing from 77 possible letters, numbers, and symbols, that yields a staggering 77^37 possibilities. On top of that, the OS includes a delay function that slows down any "brute forcing" of the passcode by a series of guesses. And if that were not enough, iOS can be set to erase the data on the phone after 10 wrong guesses.
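To get a feel for how big 77^37 really is, here is a back-of-the-envelope calculation (the billion-guesses-per-second rate is an arbitrary assumption for illustration, far faster than any real attack on a phone):

```python
import math

total = 77 ** 37  # the keyspace quoted above
print(f"77^37 is about 10^{math.log10(total):.0f}")  # -> about 10^70

# Assume, generously, a billion guesses per second with no delays:
years = total / 1e9 / (60 * 60 * 24 * 365)
print(f"exhausting it would take roughly 10^{math.log10(years):.0f} years")
```

Even at that absurd guessing rate, the answer comes out to roughly 10^53 years, which is why attackers target the short, human-chosen passcode rather than the full key.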
So how could the FBI get to the contents of this phone? By updating it with a specially crafted version of iOS that removes the lockout and delay functions, permitting the FBI to execute a highly automated series of guesses at high speed. Ironically, what makes this approach technically infeasible for the FBI, but feasible for Apple, is the company’s entire security strategy for its products: tight control over suppliers of hardware and software. That’s why most of us get all our apps from the Apple App Store. And it’s why Apple has the only digital identity that your iPhone will trust as a source for an iOS update.
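To make "a highly automated series of guesses" concrete, here is a toy brute-force loop in Python. The `check` callback and the secret code are hypothetical stand-ins; a real attack would call into the device’s passcode verification, which is precisely why the delay and auto-erase features have to be disabled first:

```python
import time

def brute_force(check, delay_per_guess=0.0):
    """Try every four-digit code in order; `check` returns True on the right one."""
    tries = 0
    for code in (f"{n:04d}" for n in range(10_000)):
        tries += 1
        time.sleep(delay_per_guess)  # models an OS-imposed delay between guesses
        if check(code):
            return code, tries
    return None, tries

secret = "7391"  # hypothetical passcode
found, tries = brute_force(lambda c: c == secret)
print(found, tries)  # -> 7391 7392
```

With no delay, all 10,000 four-digit codes can be tried in a fraction of a second; insert even one second per guess and the same search takes hours, which is the whole point of the lockout features.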
So why is it a big deal for the FBI to ask Apple to go through with this? The request sounds quite reasonable if you frame it like this: "Please use your uniquely trusted identity to disable the protective features you created and thereby enable us to guess the passcode and get at the potentially life-saving evidence on just this one device". The problem that many technology companies have with this request is also, in my opinion, the strongest of Apple’s numerous objections: The "just this one device" claim.
There is no technical or legal basis for saying this case is a one-off. If Apple complies with the current court order and creates a version of the OS that facilitates access to this one iPhone, it can be used on other iPhones. Other law enforcement agencies will join the line that is already forming to demand Apple’s assistance with other iPhones, and Apple will have no basis to refuse because that’s the way the legal system works.
It’s called precedent and, while some FBI statements seem to imply that this case would not set a precedent, I am sure it would. I say this not as a lawyer, but as someone who has read the Department of Justice "Motion to Compel" filed on February 19. The agency’s argument that Apple should be compelled to comply with the court order makes extensive use of precedents arising from prior applications of the statute invoked by the FBI in this case: The All Writs Act of 1789.
Don’t be distracted by that date. A lot of US case law regarding searches by the government is derived from the Fourth Amendment, which is almost as old as the All Writs Act. The FBI invoked that act because there is no US legislation that speaks to the role of computing device manufacturers in providing evidence to law enforcement investigations. For that matter, there is no clearly stated right to information privacy or data protection in the US constitution or any of its amendments. Absent legislation spelling out the obligations of innocent third parties to help law enforcement access evidence of a crime, the All Writs Act comes into play.
A 1977 Supreme Court decision in the case of United States v. New York Telephone upheld the use of All Writs to compel a phone company to help the government track telephone activity related to an illegal gambling case (at a time when there were no statutory provisions for that). The government’s case against Apple relies heavily on that ruling and if Apple loses in court, a further precedent will be set, one that can be used in cases impacting many aspects of our digital life. Any number of agencies will have a strong legal basis for requiring any hardware and software makers to selectively turn off security features to assist government investigations.
Right now, the outcome of FBI v. Apple is impossible to call, with George Washington University Law School professor Orin Kerr likening it to "a crazy-hard law school exam hypothetical in which a professor gives students an unanswerable problem just to see how they do". Yet, even as this case descends deeper into the weeds of US law, there are clearly implications for commerce, especially for companies with business overseas.
Consider the current negotiations to create a "Privacy Shield" in place of the Safe Harbor arrangement under which companies were allowed to process and store the personal information of European data subjects in the US. Last October that agreement was deemed inadequate by the Court of Justice of the European Union (CJEU) because, in light of the Snowden revelations, the US appeared incapable of adequately protecting that information from surveillance by its own intelligence services (notably the NSA but also the FBI). US negotiators striving to put the new Privacy Shield in place are arguing that the US understands and respects the privacy concerns of its EU trading partners. I think that argument is harder to make if Apple loses.
Something that could get easier if Apple loses is the poaching of US business by companies and countries that can convince customers they are above the privacy-security fray. Some of that convincing is specious, given that the privacy laws in many countries include exemptions for law enforcement and national security interests.
But the fact remains that the US does not currently offer its own citizens the same broad data protection that residents of EU countries enjoy. America’s alternative, a patchwork of statutes and case law, once acclaimed for fostering innovation, may end up smothering commerce if it is torn again by an FBI win in this case, or in the next one (a case involving the hypothetical but inevitable arrival of phones that even the manufacturer cannot unlock).
Stephen Cobb has been researching information assurance and data privacy for more than 20 years, advising government agencies and some of the world's largest companies on information security strategy. Cobb also co-founded two successful IT security firms that were acquired by publicly traded companies and is the author of several books and hundreds of articles on information assurance. He has been a Certified Information Systems Security Professional since 1996 and is based in San Diego as part of the ESET global research team.