In case you haven't heard, Apple announced Tuesday that it's refusing to write code for the FBI that would unlock one of the San Bernardino shooters' iPhones.
CEO Tim Cook said in a post on the company's website that doing so could undermine security for all iPhones:
The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.
But this raises an interesting question: Why does the government need Apple's help at all? Sure, the manufacturer knows the product best. But the FBI is part of the same government as the NSA. And the NSA built Stuxnet, a bit of code that wormed its way around the world to destroy air-gapped nuclear machinery in Iran. Surely someone there could defeat a four-digit passcode on a consumer phone?
It's not so simple.
The four-digit passcode and 'brute force'
The four-digit passcode on an iPhone isn't just a flimsy locked door that a sufficiently clever hacker could work around. It's the only way into the vault.
That's because most of the important information on your iPhone is encrypted. The four-digit passcode, along with a key built into the phone's hardware, is necessary to decrypt the messages and photos on your phone. Without it, they just read as garbled data.
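To see why the passcode alone isn't enough, here is a simplified sketch of that idea: a short passcode is "tangled" with a device-unique secret to derive the actual encryption key. This is an illustration, not Apple's real scheme — the hardware key value, the iteration count, and the use of PBKDF2 here are all assumptions, and on a real iPhone the device key lives in silicon and never leaves it.

```python
import hashlib

# Hypothetical device-unique hardware key. On a real iPhone this secret
# is burned into the chip and can't be read out; this value is made up.
HARDWARE_UID = bytes.fromhex("a3f1c2d4e5b697881122334455667788")

def derive_key(passcode: str) -> bytes:
    """Stretch a short passcode into a 256-bit encryption key.

    Simplified stand-in for Apple's scheme: because the passcode is
    mixed with a per-device secret, the derived key can only be
    computed on the phone itself. Iteration count is illustrative.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                               HARDWARE_UID, 100_000)

print(derive_key("1234").hex())
```

The important property: even if an attacker knows the four digits, without the hardware secret they can't reproduce the key, which is why guessing has to happen on the device itself.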
Apple is better able to access that data than the FBI because it holds a set of keys. Those keys won't unlock the vault, but they do allow it to access and modify the vault's locking mechanism — and to make changes that would allow the FBI to pick the lock on its own.
Right now, if you enter an incorrect passcode on an iPhone 5c (the model owned by the shooter), you just don't get in. Make some more mistakes and iOS will make you wait longer and longer between attempts. Screw up enough times and your phone will erase its data. This protects your iPhone from something called a "brute-force attack," a hacking attempt in which a computer tries thousands upon thousands of possible passcodes until it lands on the correct one.
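Those safeguards can be sketched as a toy model. The delay schedule and the 10-attempt wipe threshold below are illustrative, not Apple's exact values (the erase-after-10-failures behavior is an optional iOS setting):

```python
import itertools

MAX_ATTEMPTS = 10  # illustrative: iOS can wipe the phone after 10 failures

def escalating_delay(attempt: int) -> int:
    """Rough model of iOS lockout: delays (in seconds) grow after
    repeated failures. These numbers are made up for illustration."""
    delays = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]
    return delays[min(attempt, len(delays) - 1)]

def brute_force(check):
    """Try every four-digit code in order, respecting the safeguards."""
    for attempt, digits in enumerate(itertools.product("0123456789", repeat=4)):
        if attempt >= MAX_ATTEMPTS:
            return None  # the device wipes itself; the attack fails
        guess = "".join(digits)
        # time.sleep(escalating_delay(attempt))  # enforced by the OS
        if check(guess):
            return guess
    return None

# With the safeguards in place, only codes among the first 10 guesses
# ("0000" through "0009") can be found before the wipe triggers.
print(brute_force(lambda g: g == "0007"))  # found: "0007"
print(brute_force(lambda g: g == "1234"))  # None: wiped first
```

Remove the `MAX_ATTEMPTS` check and the delay, and all 10,000 codes can be tried in seconds — which is exactly what the firmware the FBI is asking for would permit.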
The FBI's order demands that Apple create a firmware update that would remove these safeguards and allow a brute-force attack. As security researcher Dan Guido writes at Trail of Bits, only Apple has the secret keys necessary to make an iPhone accept a firmware update. That's why neither the FBI nor the NSA can do it alone.
Why Tim Cook refuses
The FBI's request indicates Apple could create a firmware update that would only work on the specific iPhone involved in this case. But Apple CEO Tim Cook writes that such an exploit would lay out a route for invading many other Apple devices:
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
Interestingly, in the past Cook has said that even if Apple wanted to build a backdoor into encrypted Apple data, the company couldn't. He wasn't talking specifically about four-digit passcodes, but it's notable that in this case he believes a backdoor is possible.
What's really going on here
Even if following the FBI's demand wouldn't create a "master key" to hundreds of millions of devices, Cook has good reason to oppose it.
Cook has positioned himself and his company as stewards of their customers' security. Giving in to the FBI's demand here might set a precedent that would make it harder for Apple to say no in the future.
Both Apple and the FBI are competing for the moral high ground in a war over the ability to access your data. The FBI wants to use the case of one terrorist's data to grab a win and set a precedent in its favor. Apple wants normal people to see it as an issue concerning their data and personal privacy as well.
Are newer iPhones susceptible? That's an open question
At Trail of Bits, Guido also writes that a security feature present on iPhones newer than the 5c might render this hack unusable.
Starting with the iPhone 5s and the Touch ID fingerprint sensor, Apple has built something called a "Secure Enclave" into its smartphones. It's a separate computer built into the device's hardware. The Secure Enclave holds a second passcode for the vault door into the encrypted phone data — and both are necessary for it to open.
The enclave watches as you input your passcode. If you keep getting it wrong, it makes you wait longer and longer before it will supply its own passcode. No iOS modification can make the enclave cooperate; it runs its own, separate code.
Guido suggests that the Secure Enclave may be an impassable obstacle to the kind of hack the FBI wants on future devices:
Although this feature is not described in Apple iOS Security Guide, it is widely believed that updates to the Secure Enclave wipe all existing keys stored within it. An update to the Secure Enclave is therefore equivalent to erasing the device.
In other words, the Secure Enclave is functionally tamper-proof.
However, not everyone agrees. John Kelley, whose LinkedIn profile says he worked as an "Embedded Security Engineer" at Apple from 2008 to 2013, tweeted that the Secure Enclave is as susceptible to tampering as any device:
@AriX I have no clue where they got the idea that changing SPE firmware will destroy keys. SPE FW is just a signed blob on iOS System Part — John Kelley (@JohnHedge) February 17, 2016
Protecting your iPhone
One important fact in this debate is that the shooter used a four-digit passcode to protect the iPhone. Brute-force attacks become more difficult, and take more time, with longer and more complicated passcodes.
The six-digit passcode introduced in iOS 9 is much more secure than four digits. Where a computer could guess a four-digit code with a brute-force attack in no more than five days, a six-digit passcode might take as long as 458 days. An alphanumeric code takes even longer to crack.
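The arithmetic behind those figures is simple: worst-case cracking time is just the number of possible codes times the time per guess. The 40-seconds-per-guess rate below is an assumption chosen to match the article's estimates; the real rate depends on the hardware and on whatever delays iOS enforces.

```python
SECONDS_PER_GUESS = 40.0  # assumed average; consistent with the figures above

def worst_case_days(keyspace: int,
                    seconds_per_guess: float = SECONDS_PER_GUESS) -> float:
    """Days needed to try every code in the keyspace."""
    return keyspace * seconds_per_guess / 86_400  # 86,400 seconds per day

print(f"4-digit passcode:      {worst_case_days(10**4):8.1f} days")  # ~4.6
print(f"6-digit passcode:      {worst_case_days(10**6):8.1f} days")  # ~463
# 6 alphanumeric characters drawn from a-z, A-Z, 0-9 (62 symbols):
print(f"6-char alphanumeric:   {worst_case_days(62**6):12,.0f} days")
```

Each extra digit multiplies the keyspace by 10, and switching to alphanumeric characters multiplies it by 62 per position — which is why even a modest increase in passcode length buys years of brute-force resistance.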