I feel inadequate to write about this. Far smarter people far closer to the problem have written far better things than I ever can. But this is one of those situations where, once you have all the knowledge, I literally cannot see the other side of the argument.


Basic facts

  1. The FBI recovered an iPhone from the San Bernardino shooter that’s locked with a passcode.
  2. iPhones that have passcode locks are encrypted, meaning the data isn’t available simply by hooking the phone up to a computer. The operating system runs the passcode through a key-derivation algorithm to produce the key that decrypts the data when the phone is used. If you tried to read the storage without the passcode, it would look like gibberish.
  3. iPhone passcodes have multiple layers of protection. After 10 incorrect attempts, all the data on the phone is erased. Additionally, the passcode must be entered by hand on the device itself — you cannot feed guesses in digitally.
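To make facts 2 and 3 concrete, here’s a toy model in Python. This is emphatically not Apple’s real design — actual iPhones entangle the passcode with a hardware key and use AES, not the XOR stand-in below — but it shows the shape of the thing: the passcode gets stretched into a key, wrong guesses get counted, and the tenth failure erases the data.

```python
# Toy model of a passcode-encrypted phone. Illustration only --
# NOT Apple's implementation. The XOR cipher stands in for AES.
import hashlib
import os

class ToyPhone:
    MAX_ATTEMPTS = 10

    def __init__(self, passcode, data):
        self.salt = os.urandom(16)            # unique per device
        self.failed_attempts = 0
        key = self._derive_key(passcode)
        self._ciphertext = self._xor(data, key)
        self._check = hashlib.sha256(key).digest()

    def _derive_key(self, passcode):
        # Slow key stretching makes every single guess expensive.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                                   self.salt, 100_000)

    @staticmethod
    def _xor(data, key):
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    def unlock(self, passcode):
        if self._ciphertext is None:
            return None                       # already wiped
        key = self._derive_key(passcode)
        if hashlib.sha256(key).digest() != self._check:
            self.failed_attempts += 1
            if self.failed_attempts >= self.MAX_ATTEMPTS:
                self._ciphertext = None       # 10th miss: erase everything
            return None
        self.failed_attempts = 0
        return self._xor(self._ciphertext, key)

phone = ToyPhone("4921", b"contacts, photos, messages")
print(phone.unlock("0000"))   # None -- wrong passcode, no data
print(phone.unlock("4921"))   # b'contacts, photos, messages'
```

Without the passcode there’s nothing to attack but the ciphertext, and after ten misses there isn’t even that.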

The FBI wants the information on that iPhone (even though there’s probably nothing of value on it). The FBI claims it has no way to access the data absent Apple’s assistance.1 This is unlikely to be true, though it is true that there’s no easy way for the FBI to access the data (after the FBI already messed up the easy way to do it2).

The FBI’s solution to this is for Apple to “just” write some software that allows them easy access to the phone. That’s all! And they promise they’ll only use it in this case and never again and what’s the big deal?

The technical side

There are a few ways that Apple could help with this order. They are:

  1. Write software that would be side-loaded onto the phone, allowing software-based passcode entry and disabling the automatic data wipe. The FBI could actually do this themselves, but iPhones only accept iOS updates signed by Apple — otherwise anyone could steal your iPhone, upload a cracked OS and then access all your data.
  2. Write a new version of iOS that allows for “master key” decryption, where a known string of numbers or letters automatically decrypts any phone.
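A crude sketch of what option 2 would mean (a hypothetical design for illustration — Apple ships nothing like this): each phone’s unique key is also stored wrapped under one global master key, so whoever holds that single secret can open every phone built this way, passcode or no passcode.

```python
# Hypothetical key-escrow backdoor (option 2). Illustration only.
# The XOR cipher stands in for real encryption.
import hashlib
import os

MASTER_KEY = hashlib.sha256(b"one secret to rule them all").digest()

def xor(data, key):
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def make_phone(data):
    device_key = os.urandom(32)                  # unique per phone
    return {
        "ciphertext": xor(data, device_key),
        # The backdoor: the device key, wrapped under the master key.
        "escrowed_key": xor(device_key, MASTER_KEY),
    }

def master_decrypt(phone):
    # No passcode needed. Anyone holding MASTER_KEY can do this.
    device_key = xor(phone["escrowed_key"], MASTER_KEY)
    return xor(phone["ciphertext"], device_key)

# Works on *every* phone built this way, not just one:
for p in [make_phone(b"alice's data"), make_phone(b"bob's data")]:
    print(master_decrypt(p))
```

Note that nothing in `master_decrypt` is specific to one device — that’s the whole problem with a master key.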

On the surface, these sound reasonable. Apple does some work (6-10 engineers for a month for the first option), the FBI brings the phone to Apple, Apple loads the software and poof! Fixed!

This is the FBI’s contention.

There are a few problems with this, the first and foremost being that once the ability to do this exists, it exists forever. Apple would of course guard this code with the strictest security, but no security is foolproof. It’s entirely possible the software could get out, and once it’s out, it’s out forever.

The second problem is that there’s absolutely nothing about this that limits it to one phone, as evidenced by the numerous law enforcement officials who are waiting to see if Apple is forced to comply so they can demand the same thing, hundreds of times over. Oh, and some didn’t even bother to wait.

The actual big deal

Security is binary. Either something is secure, or it isn’t. Cracking one iPhone by design means that all iPhones of that same model have been cracked. Once it’s been done, there’s no way to un-crack it, short of releasing a new OS. And if the FBI forced them to write one backdoor, you’d better believe they’ll come knocking again.

We as a country went through this before, with encryption. In the 90s, the government wanted to make sure there was a way it could decrypt voice conversations (the Clipper chip). Of course, the only way to do this was via a backdoor. Once that backdoor was created, someone with enough time, computing power and skills could figure it out. And they did. If a backdoor exists, a system is not secure.

iPhones do seem different, I’ll agree. It’s theoretically possible that the software could be locked up tight on Apple’s campus and no bad actors could ever gain access to it. No worries about stolen iPhones and thieves getting your personal information.

But it doesn’t stop at iPhones. Already the government is asking WhatsApp to insert a backdoor into its end-to-end encryption (meaning the company itself can’t decrypt the messages). And again, once you put a backdoor in, anyone can exploit it. What about when they need a Facebook chat? Or a Google Hangout?

Then there are the other bad actors in the room: What about Iran? North Korea, Russia, Syria, Egypt, any place with a dictator and a lot of citizens. They could demand the same access and use it to spy on dissidents, opposition parties or whoever the hell they want. And hey, if the US can demand it, why can’t they?

I legitimately can’t see the other side on this one.

Terrorists are bad. ✓

We should do everything in our power, legally, to stop them. ✓

The government doesn’t get carte blanche access to whatever it wants from you simply because it’s the government and it thinks you might be bad. ✓

P.S. Also, we’re going to ignore the, “Well, only bad people have to worry about it!” strawman, because it is a strawman and my hay fever prevents the consideration of silly ideas. Bad people are defined as whoever the people in charge of the system think are bad. And sometimes not even then — remember the NSA’s “love taps”?

  1. I’m fairly certain there are several ways for the FBI to do this; they’re just labor-intensive. Among them: the FBI could clone the phone’s storage, enter 10 passcode guesses, then (when the data is destroyed) restore from the clone and try again. Every time the phone gets erased, you re-clone it.

  2. The FBI claims there would be more data than the backup, but I’m pretty sure, “Well, there’s more evidence elsewhere” is not a great excuse for destroying other evidence.
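The clone-and-retry approach from footnote 1 can be sketched as a loop. This is a simplification — actually mirroring an iPhone’s NAND storage is far messier, and each guess on real hardware is deliberately slow — but the logic is just this:

```python
# Sketch of footnote 1's clone-and-retry brute force. Simplified model:
# a phone that wipes after 10 misses, attacked by re-imaging the storage.
import copy

class WipingPhone:
    def __init__(self, passcode):
        self.passcode = passcode
        self.attempts_left = 10
        self.wiped = False

    def try_passcode(self, guess):
        if self.wiped:
            return False
        if guess == self.passcode:
            return True
        self.attempts_left -= 1
        if self.attempts_left == 0:
            self.wiped = True                  # data gone on this copy
        return False

def brute_force(original):
    clone = copy.deepcopy(original)            # image the storage first
    for code in range(10_000):                 # every 4-digit passcode
        guess = f"{code:04d}"
        if clone.try_passcode(guess):
            return guess
        if clone.wiped:
            clone = copy.deepcopy(original)    # restore from the image
    return None

print(brute_force(WipingPhone("7380")))   # 7380
```

The wipe limit never bites because the attacker only ever sacrifices copies, which is exactly why it’s labor-intensive rather than impossible.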