We rely on our smartphones, tablets, and computers, so digital security matters to us whether we know anything about it or not. But it’s also tough to understand: we have little choice but to trust that when companies say they’re doing everything they can to keep our data and information secure, they’re actually doing it. They’re the experts, right? You know, like Target. And Adobe. And Yahoo. And Facebook. And many, many others.
Apple is not immune to security problems (it just patched a huge SSL bug in iOS and OS X – if you haven’t updated, back up and do it now). But unlike other big tech players, the company has published a detailed overview of its security measures, answering key questions about how Apple secures users’ passwords, data, messages, and devices – an unusually public statement from such a famously secretive company.
The upshot: Apple takes this stuff very seriously – and perhaps differently than other companies. Here are a few examples.
The (private) keys are in your hands
Much of Apple’s security infrastructure relies on public key cryptography, also called asymmetric cryptography – a widely accepted idea that’s been around since the 1970s. (Read up on how public key encryption works here.)
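To make the asymmetric idea concrete, here’s a deliberately tiny textbook-RSA sketch in Python. The primes, exponents, and message are standard toy values for illustration only – nothing Apple actually uses, and far too small to be secure:

```python
# Toy textbook RSA with tiny primes -- for illustration only, never for real use.
p, q = 61, 53
n = p * q                 # public modulus (3233)
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent, coprime to phi
d = pow(e, -1, phi)       # private exponent: modular inverse of e (2753)

message = 65                       # a message encoded as a number < n
ciphertext = pow(message, e, n)    # anyone with the PUBLIC key can encrypt
recovered = pow(ciphertext, d, n)  # only the PRIVATE key holder can decrypt

print(ciphertext, recovered)  # → 2790 65
```

The asymmetry is the whole point: the public key (n, e) can be handed out freely for encrypting, while decryption requires the private exponent d, which never needs to leave its owner.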
So Apple must hold the keys to your conversations, right? Well, no. It turns out Apple only has the public keys for services like iMessage and FaceTime; the private keys never leave a particular iOS device. Apple uses those public keys to encrypt every iMessage separately for every device (and only that device). Further, Apple deletes iMessages once they’re successfully delivered (or after seven days if they’re not received), so they don’t linger long on Apple’s servers. (Photos and long messages get encrypted separately, subject to the same deletion rules.) That means even if someone cracks Apple’s servers (or a government serves them a subpoena), Apple probably won’t have much (or any) iMessage data to turn over. Apple also alerts users immediately when a new device is added to their account, hopefully preventing someone from illicitly adding a device so they can receive their own copies of your messages.
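The per-device fan-out described above can be sketched with the same kind of toy RSA numbers. Here two hypothetical “devices” each hold their own keypair; the server relaying messages sees only public keys and ciphertexts. This is an illustrative model, not Apple’s actual iMessage protocol:

```python
# Two hypothetical devices, each with its own toy-RSA keypair (n, e, d).
# Private exponents (d) never leave the device; servers see only (n, e).
devices = {
    "iphone": {"n": 3233, "e": 17, "d": 2753},
    "ipad":   {"n": 2773, "e": 17, "d": 157},
}

def send_imessage(message: int) -> dict:
    """Encrypt the message separately for every registered device,
    using only each device's public key."""
    return {name: pow(message, k["e"], k["n"]) for name, k in devices.items()}

def receive(name: str, ciphertexts: dict) -> int:
    """Each device decrypts only its own copy, with its own private key."""
    k = devices[name]
    return pow(ciphertexts[name], k["d"], k["n"])

blobs = send_imessage(65)
assert receive("iphone", blobs) == 65
assert receive("ipad", blobs) == 65
# A server (or attacker) holding `blobs` has no private key, so it reads nothing.
```

Note what adding a device means in this model: a new public key joins the fan-out list and starts receiving its own ciphertext copies – which is exactly why Apple’s new-device notifications matter.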
What about your Keychain?
Apple’s iCloud keychain handles sensitive data – like passwords and credit card numbers – and keeps them synchronized between devices. So iCloud must keep a copy of that data to do the syncing, right? Well, no.
Apple uses a similar public-keys-only method to synchronize Keychain items. Apple encrypts each item separately for each device, and only syncs one item at a time as needed, making it very difficult for an attacker to capture all your Keychain data even if Apple’s core system were compromised. To get your Keychain, an attacker would need both your iCloud password and access to one of your approved devices in order to add one of their own – along with fervent prayers you never see those notices Apple sends immediately when a new device is added.
Okay, so what about the optional iCloud Keychain Recovery? Apple must have all your Keychain data in order to restore it all, right? Well, yes. But Apple’s done something clever here too. By default, Apple encrypts Keychain Recovery data with Hardware Security Modules (HSMs), hardened devices used by banks and governments to handle encryption tasks. Apple has programmed the HSMs to delete your data after ten failed attempts to access it. (Before reaching that limit, users must contact Apple directly to make further attempts.) To prevent anyone from reprogramming the HSMs to change their behavior, Apple says it has destroyed the administrative access cards that allow firmware changes.
Even Apple can’t change the system without physically replacing whole clusters of HSMs in its data centers – a pretty intense physical security barrier for would-be attackers. And even if they pulled that off, the attack would only work on newly stored Keychains: existing ones would still be safe.
Lightning in a bottle
Apple has confirmed long-standing suspicions that manufacturers in Apple’s Made for iPhone program must include a cryptographic circuit supplied by Apple for Bluetooth, Wi-Fi, or Lightning access to iOS devices. The circuit proves a device is authorized by Apple; without it, iOS accessories are limited to analog audio and audio playback controls: enough for speakers, but no access to your apps or data. Some might argue this custom chip is an example of Apple forcing you to buy its own products, but it also means the odds are very low that plugging in somewhere to charge your device will compromise its security.
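Hardware authentication of this sort is typically a challenge-response protocol: the device sends a fresh random challenge, and only an accessory holding the right credential can produce a valid answer. Here’s a heavily simplified, hypothetical sketch using an HMAC shared secret as a stand-in (the real Made for iPhone chip uses certificate-based cryptography, and every name here is invented):

```python
import hashlib
import hmac
import os

SECRET = os.urandom(32)  # stand-in for the credential baked into the auth chip

def accessory_respond(challenge: bytes) -> bytes:
    """The accessory's auth chip proves it holds the credential."""
    return hmac.new(SECRET, challenge, hashlib.sha256).digest()

def device_verify(challenge: bytes, response: bytes) -> bool:
    """The iOS device checks the response before granting full access."""
    expected = hmac.new(SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)  # a fresh nonce each time, so replayed answers fail
assert device_verify(challenge, accessory_respond(challenge))
assert not device_verify(os.urandom(16), accessory_respond(challenge))
```

Because the challenge is random every time, recording one successful exchange doesn’t help a counterfeit accessory: it would need the credential itself.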
Tip of the iceberg
Apple’s white paper discusses many other technologies, including Siri (and how long Apple holds on to that data), the 64-bit A7 processor, the iPhone 5s’s Touch ID feature (Apple estimates the odds of a random fingerprint matching yours at about 1 in 50,000), and how apps and data are secured within iOS itself. Security experts will be pondering the contents for a long time.
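That 1-in-50,000 figure compounds with a retry limit – iOS falls back to requiring the passcode after five failed fingerprint attempts – so a stranger’s odds of ever getting a false match stay tiny. A quick back-of-the-envelope calculation:

```python
p_single = 1 / 50_000   # Apple's estimated false-match rate per attempt
attempts = 5            # tries allowed before iOS demands the passcode
p_any = 1 - (1 - p_single) ** attempts  # chance of at least one false match
print(f"{p_any:.4%}")   # ≈ 0.01%, i.e. roughly 1 in 10,000
```

In other words, the per-attempt odds barely accumulate before the fallback passcode takes over.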
Apple’s paper is a solid step forward. One could hope it will inspire other companies to detail how they keep users’ data secure – but I wouldn’t hold my breath.