In December 2015 a terror attack was carried out in San Bernardino by two radical islamofascists. During the ensuing investigation an iPhone belonging to one of the attackers was recovered. That iPhone became the target of an FBI effort to retrieve the data stored on it. The FBI was ultimately successful, and today we’ll detail the process by which this could have happened.
Apple was willing to help by providing the FBI with an iCloud backup of the device. This, however, was not possible because the FBI had asked San Bernardino County (the owner of the phone) to change the iCloud password. That reset meant the device now required passcode entry before it would back up to iCloud. As a result the FBI asked Apple for help accessing the data directly; Apple refused, and what looked to be a lengthy court process began.
Apple and many other tech companies appealed to the public for support, and rightly so. Modern encryption cannot remain strong in the presence of any backdoor. Even if Apple retained the keys to a hypothetical backdoor, a rogue employee could use the key for ill or hand it to law enforcement. Worse still, Apple could be covertly infiltrated by one of the three-letter agencies.
The other problem with creating a backdoor relates to foreign powers. While we in North America and Western Europe enjoy many liberties, and the authorities are bound by the same laws they enforce, this is not true in other parts of the world (Russia, the Middle East, China, etc.). By creating a backdoor Apple would also have had to comply with requests from less desirable nations if it wanted to continue doing business in those regions (China is one of Apple’s largest growing markets). What’s more, the rules set in place by these foreign powers might not serve the same ends that the FBI was seeking in the United States. As a result Apple had further reason to hold its position.
Over the last year or so Apple has made user privacy central to its marketing. With the release of iOS 9 and the iPhone 6s (and 6s Plus), Apple said numerous times that it is the only company that does not send copious amounts of data over the internet to provide virtual assistant services (Siri, as opposed to Cortana or Google Now). Apple also loves to mention that iMessage is encrypted and only your device can read the messages sent to it (if anyone wants to know more about iMessage encryption, PM me!). Finally, every device since the iPhone 5s is secured by encryption whenever it is locked or powered down. Apple loves being the company of user privacy by way of strong encryption.
So while Apple had valid technical reasons for resisting, it also had marketing and monetary reasons to resist the FBI. The FBI, meanwhile, wanted this case to serve as a precedent for future digital encryption cases: had the FBI won, it could have demanded that devices tied to future criminal cases be unlocked as well. Both parties had purported reasons for their actions. On Monday, March 28, the FBI announced that it had successfully contracted with an outside firm to unlock the device, and the court case was dropped.
How did this happen? And is my device vulnerable now?
The iPhone 5c (the model in question) is a 32-bit device and stores its encryption keys in local storage alongside the operating system and user data. The iPhone 5s and newer run on a 64-bit platform, and they store encryption keys and other sensitive user-created data in a “secure enclave” on the CPU itself. This means that code running on an iPhone 5s or newer cannot access the secure enclave without the device first being unlocked (i.e. via Touch ID or passcode entry). Now you may be asking: why does it matter whether the chip is 32-bit or 64-bit? It is not the word size itself that matters; rather, Apple’s 64-bit chips (the A7 and later) are the generation that ships with the secure enclave, so on those devices the key material never sits in ordinary storage where it could be reached from outside.
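The general idea behind tying keys to the hardware can be sketched with a toy key-derivation function. This is illustrative only, not Apple’s actual implementation: it assumes a PBKDF2-style scheme, a made-up `DEVICE_UID` value, and an arbitrary iteration count. The point is that because a device-bound secret is mixed into the derivation, the passcode cannot be brute-forced off-device even if the flash storage is copied:

```python
import hashlib

# Hypothetical value for illustration only. On a real iPhone the UID is a
# per-device AES key fused into the CPU (and, on the 5s and later, held
# inside the secure enclave); it is never readable by software.
DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")

def derive_passcode_key(passcode: str, iterations: int = 50_000) -> bytes:
    """Sketch of entangling a passcode with a device-bound secret.

    The UID acts as the salt, so the derived key can only be computed
    on the device that holds that UID.
    """
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode("utf-8"),
        DEVICE_UID,   # device-bound salt
        iterations,
        dklen=32,
    )

key = derive_passcode_key("1234")
print(key.hex())
```

Note that the deliberately slow derivation also throttles on-device guessing, which matters for the brute-force discussion below.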
It was pure luck, then, that the phone in question was a 5c and not a newer model; otherwise the data would not have been accessible in any easy fashion at all. Those in the iOS jailbreak community had theorized that it would be trivial to remove the passcode and access the data on an iPhone 5c. Had the phone been a 5s, it is highly likely that we would be witnessing court proceedings rather than hearing the news that the phone’s data had been recovered.
If you have an iPhone 5c or older, then yes, in theory your device could be broken into by law enforcement. If you are at all concerned about your phone falling into the hands of those who may want your data, it is recommended that you move to a 6-digit passcode or an alphanumeric password. This will prevent casual attempts to guess the passcode, though it does not defend against the theorized methods used by the contracted Israeli firm. If you have an iPhone 5s or newer, your device is probably impenetrable. However, as with all security matters, we don’t know how secure a piece of hardware or software is until it has been extensively probed.
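To see why passcode length matters, here is a back-of-the-envelope estimate of worst-case brute-force time. It assumes roughly 80 ms of on-device key derivation per attempt (a figure Apple has cited in its iOS security documentation) and, for simplicity, ignores the escalating lockout delays and the optional 10-attempt erase, both of which make real attacks far slower or impossible:

```python
# Assumed cost of one passcode attempt: ~80 ms of key derivation.
PER_GUESS_SECONDS = 0.080

def worst_case_seconds(keyspace: int) -> float:
    """Seconds to exhaust an entire keyspace at one guess per 80 ms."""
    return keyspace * PER_GUESS_SECONDS

scenarios = {
    "4-digit PIN": 10 ** 4,
    "6-digit PIN": 10 ** 6,
    "6-char lowercase alphanumeric": 36 ** 6,
}

for name, size in scenarios.items():
    days = worst_case_seconds(size) / 86_400
    print(f"{name:30s} {size:>13,d} guesses  ~{days:,.2f} days")
```

Even under these generous assumptions, a 4-digit PIN falls in minutes while a short alphanumeric password holds out for years, which is exactly why the upgrade is worth making.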
That’s how this story ended and how a lengthy court battle was averted. I’ll put together a follow-up piece if we get a definitive answer on how the iPhone 5c in question was broken. If you have further questions about iOS security or the security of iMessage, please send me a private message and we may be able to orchestrate an article on the matter!