Why Apple is making a stand against the FBI

Companies cannot meaningfully offer security and privacy measures to their customers if they are simultaneously forced to subvert them for governments, with all the risks involved. By Eerke Boiten.

Tim Cook painted portrait, by Thierry Ehrmann

Apple has been ordered to help FBI investigators access data on the phone belonging to San Bernardino gunman Syed Rizwan Farook. The technical solution proposed by the FBI appears to undermine Apple’s earlier claim that they would be unable to help. However, in a strongly worded reply, Apple CEO Tim Cook has indicated that Apple is unwilling to comply with this order, as it would do irreparable damage to all iPhone owners’ security and privacy.

On newer Apple phones like Farook’s (an iPhone 5c, running iOS 9 according to the court motion), data stored on the phone is protected by encryption, using the passcode (which is also used for unlocking the phone) as part of the key. (This is a different issue from “end-to-end encryption”, which concerns iMessages when they are in transit between phones.)
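
Apple’s iOS Security whitepaper describes the passcode being “entangled” with a unique hardware key (UID) fused into each device, through a deliberately slow key derivation. The Python sketch below illustrates the idea only; the names are hypothetical and PBKDF2 stands in for Apple’s actual construction:

```python
import hashlib
import os

# Hypothetical stand-in for the device-unique hardware key (UID),
# which on a real iPhone is fused into the silicon and never leaves it.
DEVICE_UID = os.urandom(32)

def derive_data_key(passcode: str, iterations: int = 100_000) -> bytes:
    """Illustrative only: entangle the passcode with a device-bound
    secret via a deliberately slow derivation. PBKDF2 is a stand-in
    for Apple's real construction."""
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),
        DEVICE_UID,   # device-bound salt: guesses only work on the phone itself
        iterations,   # tuned so every guess carries a fixed time cost
    )

print(derive_data_key("1234").hex())
```

Because the hardware key never leaves the device, every passcode guess has to run on the phone itself rather than on fast external hardware, which is why investigators cannot simply copy the encrypted data off the phone and attack it elsewhere.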

Apple recently claimed that it was unable to decrypt such information at all, as it does not have the passcode. This is also the line it takes in its policy statement on providing information to governments, first posted in May 2014.

Blocking brute force

In the court order, the FBI is undermining this claim. The FBI claims that Apple can write and run software that can help discover the passcode and access Farook’s data. The software should switch off security features that currently prevent a “brute force attack” — trying all possible passcodes — which should take little time if the passcode is “numerical” as claimed by the FBI.
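
Some back-of-envelope arithmetic shows why. Assuming a four- or six-digit numeric passcode, and the roughly 80 milliseconds per attempt that Apple’s whitepaper cites as the intrinsic cost of its hardware-bound key derivation (the cost that remains even once the artificial barriers are removed):

```python
# Exhaustive search over a numeric passcode space, assuming ~80 ms
# per attempt for the hardware-bound key derivation alone.
PER_ATTEMPT_SECONDS = 0.08

for digits in (4, 6):
    keyspace = 10 ** digits
    worst_case_s = keyspace * PER_ATTEMPT_SECONDS
    print(f"{digits}-digit code: {keyspace:,} candidates, "
          f"worst case {worst_case_s / 3600:.1f} hours")

# 4-digit code: 10,000 candidates, worst case 0.2 hours (~13 minutes)
# 6-digit code: 1,000,000 candidates, worst case 22.2 hours
```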

One of these security features is an enforced, escalating delay between repeated passcode attempts, which would make a brute force attack prohibitively slow. The other defence against brute force is auto-erasure: if it is switched on (as appears likely), the data on the phone is effectively erased after 10 failed attempts.
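
A minimal sketch of how these two protections might fit together; the names and delay schedule here are entirely hypothetical (iOS’s real schedule escalates from minutes up to an hour):

```python
import time

class PasscodeGuard:
    """Illustrative sketch, not Apple's code: the two protections the
    FBI wants disabled are the escalating delay and the 10-strike
    erasure of the data-protection key."""

    DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]  # hypothetical, seconds
    MAX_FAILURES = 10

    def __init__(self, correct_code: str):
        self._code = correct_code
        self._failures = 0
        self._wiped = False

    def try_passcode(self, guess: str) -> bool:
        if self._wiped:
            raise RuntimeError("key material erased; data is unrecoverable")
        # Enforced delay grows with each consecutive failure.
        time.sleep(self.DELAYS[min(self._failures, len(self.DELAYS) - 1)])
        if guess == self._code:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_FAILURES:
            # Auto-erasure discards the encryption key, not the data itself,
            # which is why the data is only "effectively" erased.
            self._wiped = True
        return False
```

The FBI’s request amounts to a build in which the delay and the wipe above are compiled out.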

Finally, to enable automation of the brute force attack, the FBI is asking for a method to enter passcodes electronically. With all of this, the FBI has been careful to point out that it would not be attempting to break encryption — but merely asking Apple to remove security measures that get in the way of the FBI discovering the key.

A message to its customers

In its response, Apple has not gone down the road of claiming that the suggested approach will not work. This may indicate that it could actually work, but it may also be a deliberate choice to focus the argument elsewhere. Crucially, Cook’s response appears targeted at the public: it’s headed “A Message to Our Customers”. This is in line with Apple’s general marketing, which emphasises privacy as a selling point.

Cook stresses how dangerous the proposed software would be for personal security. iPhone encryption now protects all sorts of important personal data, and once software like this exists, it could end up in the wrong hands or be used for much wider purposes by governments. Essentially, it would weaken encryption permanently for Apple’s customers.

Cook also makes it clear that Apple has no sympathy with terrorists — but points out they will always be able to find more secure methods if Apple’s security is weakened. The FBI’s argument that the software would be less risky by only being built for the specific phone is also quickly dismissed.

Making a stand

Cook makes a strong stand against this and other so-called backdoor methods of accessing a phone’s data. He argues that companies should not be asked to systematically undermine the security they build into their products.

This echoes Apple’s recently submitted evidence on the UK draft Investigatory Powers Bill. There, Apple strongly resisted being asked to assist in “bulk equipment interference” and “removal of electronic protection”, either of which could allow UK intelligence and law enforcement to make requests very similar to the FBI’s. IT lawyer Neil Brown even suggested that the UK has a similar law to the 1789 All Writs Act invoked to justify this US case.

The nominal audience may have been Apple’s customers, but really Cook chose to make a stand against governments. The argument that encryption is a necessary component of security and privacy in a modern society appeared to be won already. However, companies cannot meaningfully offer security and privacy measures to their customers if they are simultaneously forced to subvert them for governments, with all the risks involved.

Apple is demanding the right to operate in the market it is carving out for itself. We may feel the company has common sense on its side, but it is clear that the battle over the legal position is far from over.

  • Eerke Boiten is senior lecturer, School of Computing and director of Academic Centre of Excellence in Cyber Security Research, University of Kent
  • This article was originally published on The Conversation

  • Greg Mahlknecht

    While I agree 100% with Apple’s stand on this, I think the FBI have outplayed Apple this round. I don’t believe Tim Cook’s reply is very convincing. Will be interesting to see where this goes next.

  • baasted_123

    Google is also against this. So if the 2 biggest companies in the world combine forces, I do not see them losing this.

  • Greg Mahlknecht

    The size of the company doesn’t matter … remember what happened in the MS antitrust case? And Google’s lost privacy cases in France and Germany in recent years.

    My point was that Apple seem to have put all their focus into pushing their “we can’t decrypt it” defence, and the FBI came back with “okay, no problem, we don’t want that”. I think Apple overlooked this backdoor tactic the FBI is pursuing.

    They are asking for something that would be absolutely trivial for Apple to supply.

  • Dave Baker

    It unfortunately is the wrong battleground…
    The national security argument tugs at insecurities.
    Apple can’t play that game.

    Did you hear that the FBI botched it by getting the owner of the phone to change the iCloud password?
    That means that Apple can’t force a backup to iCloud without the passcode.

    Apple have made the iCloud backups available in terms of the warrant.
    So if the FBI hadn’t done that, they could have gotten to most of the data via the backups.

  • Greg Mahlknecht

    I agree that the FBI don’t care about the phone (the case is pretty much closed and solved, which is why they can take their time to run this principled battle to its close), but Apple are fighting this really badly, and it’s being HORRIBLY reported. I’m hearing such ridiculous things like “They’re forcing Apple to become a forensics company” … no, this will simply be a few #define’s in the iOS source, they make a build – they don’t even need to sign it as they could jailbreak the phone in question – and give it to the FBI (or invite the FBI in and let them work on the phone in their lab). Apple will already have any number of internal diagnostic builds that hackers would love to get hold of but haven’t yet.

    This isn’t a backdoor. It’ll really just be a “diagnostic build”, as we call it in the coding world. In fact, over time iOS has had 2 vulnerabilities that allowed exactly what the FBI are asking for – the first meant one could brute-force the lock screen using a Bluetooth keyboard; the second was a hardware shim wired into the iPhone that entered the passcode and reset the phone if the code was wrong, before it could lock up. So Apple has unwittingly already shipped this “backdoor”, as they call it, twice, and the world didn’t end like Tim Cook said it would.

    I’m not agreeing that the FBI should be able to do this, but I feel Tim’s reply was very weak and didn’t convince me. I wish they’d come up with a more compelling argument for the sake of all of us, because I can see from a million miles away that it’s total BS, and it might not be enough to convince a judge.

  • Greg Mahlknecht

    It’s a one-off build for Apple that can’t get out into the wild, and it is not exploitable like everyone seems to think it is. Watch the last Security Now podcast – Steve Gibson goes into good detail, and uses information from Apple’s own security whitepapers to explain why this isn’t a backdoor but a custom once-off build. And the FBI has said they don’t want this build; they’re happy for Apple to put the build onto the phone under their control, and only work on it under supervision inside the Apple campus.

    >Tim’s reply was for the layman

    This isn’t the reason he was wrong. He was wrong because the points he was making in layman’s terms were simple ones, and they are refuted by his own company’s security whitepaper. As I mentioned, there have already been backdoors in iOS that allowed this kind of brute-force attack, and as the whitepaper explains, Apple has built in controls to ensure that hackers or the FBI can’t downgrade back to those versions to brute-force it.
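
    To make the downgrade point concrete: the whitepaper describes the device accepting only firmware whose signature covers a fresh per-device nonce, so even a genuinely Apple-signed older build can’t be replayed onto the phone. A rough sketch of that idea, with hypothetical names and HMAC standing in for Apple’s real signature scheme:

    ```python
    import hashlib
    import hmac
    import os

    SIGNING_KEY = os.urandom(32)  # stand-in for Apple's firmware-signing key

    def sign_firmware(image: bytes, device_nonce: bytes) -> bytes:
        # The signature covers the image *and* a fresh per-device nonce,
        # so a signature issued for an old build can't be replayed later.
        return hmac.new(SIGNING_KEY, image + device_nonce, hashlib.sha256).digest()

    def device_accepts(image: bytes, sig: bytes, device_nonce: bytes) -> bool:
        expected = hmac.new(SIGNING_KEY, image + device_nonce, hashlib.sha256).digest()
        return hmac.compare_digest(expected, sig)

    old_build = b"iOS build with the Bluetooth lock-screen hole"
    old_sig = sign_firmware(old_build, os.urandom(8))  # nonce at original signing

    print(device_accepts(old_build, old_sig, os.urandom(8)))  # False: fresh nonce, stale signature
    ```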

    > Apple has gone to secure the iPhone read this.

    Ironically, that’s the document Steve Gibson uses to prove this request isn’t the threat Tim Cook is making it out to be.

    Of course, if Apple maintains their position that this custom build could leak and become a backdoor, they are admitting GIGANTIC holes in iOS security, which would allow hackers to use the older hardware/Bluetooth hacks to achieve the same goal.

  • Greg Mahlknecht

    >This has never been about whether they can break into their own phone, it’s about whether they should be forced to.

    We can agree here, and therein lies the problem – Apple has chosen to fight this on the principled “if we do this, the government MIGHT ask us to do it more in the future” approach, and that’s a very weak legal argument. I have no idea how one would respond to the FBI’s request on legal grounds to refute it, and Apple’s lawyers seem to be stumped too.

  • Richard Wickens

    You are still missing the point: I am not talking about the feasibility of breaking the security of the phone. Obviously, if they have the source code, they can make the changes being requested to get the encryption key. It’s more about the legal precedent it will set.

    I work in IT as a developer; ALL code (beyond Hello World, perhaps) has vulnerabilities. It’s just a question of finding them and exploiting them. Take Linux, for example: for years it was touted as the most secure because there were so few viruses etc. for it, but now that it’s becoming more mainstream (thanks in part to Android), all these new vulnerabilities are being found and exploited (Heartbleed, Shellshock, GHOST) because the black hats are turning their attention to looking for them, since it’s now actually worthwhile to do so. There is no value in hacking an operating system that is not widely in use.

  • Greg Mahlknecht

    >It’s more about the legal precedent it will set.

    Agreed. But the technicals are important, because that’s what Apple is arguing with.

    >Obviously if they have the source code they can make the changes being requested to get the encryption key

    Yes, but Cook’s argument is that this is absolutely the start of a slippery slope. Unless there are catastrophic breaches and leaks in Apple security, it really isn’t. And if there were a breach of that magnitude at Apple, leaking this build would be the least of anyone’s worries.

    I’m on Apple’s side, I just don’t agree with their argument.