You might think that this case is an easy one, that Apple wants to protect its customers’ privacy and the government doesn’t, that Apple is right and the FBI is not. Well, it’s not that simple.
First, let me provide a little bit of context:
- On December 2, 2015, Syed Rizwan Farook and his wife Tashfeen Malik shot and killed 14 people and injured 22 others at the Inland Regional Center in San Bernardino, California.
- The FBI recovered an iPhone 5c, issued by Farook’s employer, which “may contain critical communications and data prior to and around the time of the shooting”.
- The FBI obtained a warrant to search the iPhone, and the iPhone’s owner consented to the search.
- The iPhone was locked, so the FBI asked Apple to help execute the search warrant.
Apple refused in a lengthy letter written by CEO Tim Cook (full text here). Here’s a short extract:
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
After reading that letter, I concluded that Apple was right. But after discussing the case with a good friend, I realized my conclusion was too simplistic.
He suggested I read the original Court Order filed by the U.S. Attorney’s Office in Los Angeles (full text here), where I found this:
In fact, Apple has previously complied with similar data search warrants. Take a look at section “I. Extracting Data from Passcode Locked iOS Devices” of Apple’s Legal Process Guidelines. This is no secret, and it shouldn’t surprise anyone either, since the same applies to the other big tech companies.
The difference this time is that Apple simply has not implemented the functionality required to crack devices running iOS 8.0 or later. Would building that functionality be risky? Yes, but the same can be said of many other security-related developments.
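To give a sense of what that functionality would enable, here is a rough back-of-the-envelope sketch. What the order asks for is, in effect, removing the erase-after-ten-failures and escalating-delay protections so the passcode can be brute-forced; with those gone, the main remaining limit is the roughly 80 ms per attempt that iOS’s hardware-backed key derivation takes (a figure from Apple’s own iOS Security guide; the function below is mine, purely illustrative):

```python
# Worst-case time to brute-force a numeric iOS passcode once the
# attempt-limiting protections are disabled. The only remaining cost
# is the ~80 ms per try imposed by the hardware key-derivation step.

ATTEMPT_TIME_S = 0.08  # ~80 ms per passcode attempt (iOS Security guide)

def worst_case_hours(digits: int) -> float:
    """Hours needed to try every numeric passcode of the given length."""
    return (10 ** digits) * ATTEMPT_TIME_S / 3600

for digits in (4, 6):
    print(f"{digits}-digit passcode: {worst_case_hours(digits):.1f} h worst case")
```

A 4-digit passcode falls in well under an hour; even a 6-digit one takes about a day. That is why the protections the FBI wants bypassed matter so much.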
We should find a solution that protects the privacy rights of Apple’s customers, and at the same time complies with the law.
Congressman Ted Lieu made a statement about this that I find enlightening:
This FBI court order, by compelling a private sector company to write new software, is essentially making that company an arm of law enforcement. Private sector companies are not—and should not be—an arm of government or law enforcement.
I agree with that: any request from the Department of Justice to the private sector should be backed by proper, explicit legislation, not by the extremely vague text of the All Writs Act (28 U.S. Code § 1651), which dates back to 1789.
In any case, and given that the resolution of the Apple vs FBI case will have deep implications for the tech industry, I just hope we can come out of this as a better society.