Apple to give FBI backdoor to the iPhone…
Coming soon to cinemas everywhere
In years to come I’m almost positive that we can expect a film adaptation of certain recent events. Like a weird mash-up perhaps of The Social Network (2010), The Matrix (1999), Hackers (1995), and… I don’t know… Welcome to Macintosh (yes that’s a real film too, made in 2008) – Yes I’m referring to the real life story of… When the Valiant Feds Bit into the Poison Apple, or perhaps it is… The Day the Tech Resistance Stood Strong against the Evil Sith State?
Whatever they call it, it doesn’t matter, I’m confident that it will be terrible.
So what’s the fuss all about?
On December 2, 2015, a mass shooting and attempted bombing in San Bernardino, California left 14 dead and 22 seriously injured. Within four hours a search, car chase and shootout ended in the perpetrators being shot and killed. Within four days the shooting had been declared an act of terrorism.
There is no question that the events of that day were an enormous tragedy, and it is quite right that we should expect that anyone asked should do everything within their power to aid the investigation.
One such anyone is Apple – an iPhone 5c used by one of the attackers was recovered, and Apple have been ordered to help crack it open. But they don’t want to.
That’s not OK! Is it?
Whatever your current thoughts, assuming you have any, regarding the ongoing saga between the FBI and DoJ, and their apparent nemesis The Big Evil Apple Corporation (who only care about profits and shareholders, by the way, and definitely don’t care about you, the consumer) – I think it is worth taking a large enough step back to note something:
- There are a lot of very very intelligent people weighing in with some very strong opinions on this matter, so surely there’s a chance that at least some of them might have a point? Oh and there’s the FBI too. They have guns and badges and stuff.
Disclaimer – I do have an opinion. I shall try my best to present information in a well-balanced manner, though one side really isn’t giving us a great deal to work with.
Nope. I don’t get it.
I’m not surprised. Firstly, I haven’t actually told you anything yet, and secondly, this case is massively, massively more complex than it appears at first glance.
OK, so we are told that the married couple who carried out the attack were what is being called “home-grown violent extremists”. That is, they were self-radicalised copycats, inspired by but not actually linked to or directed by any extremist group. They were not carrying out some larger group’s master plan; they had one other accomplice, who has already been arrested and charged with, among other things, supplying the rifles used in the attack.
In cooperation with the FBI, Apple already supplied all data that was backed up from the iPhone to the iCloud service, so the data in question is everything since the last backup to iCloud.
Syed Rizwan Farook, the owner of the recovered iPhone, had other phones. One, the iPhone in question, was issued by his employer – San Bernardino County. Two others were personal devices, which he destroyed. Realistically, on which device(s) were communications of a terrorist nature likely to have been made?
Don’t worry, I’m not suggesting there’d be no value in accessing the data on the iPhone, just trying to put that value into perspective.
Well then what’s your point? We should look anyway.
Oh yes, absolutely. But since iOS 8 Apple have improved security significantly – it’s a major selling point for the device. All data is encrypted, and nobody, not even Apple, has access to the decryption key. Your data is not accessible until you have entered the correct unlock code. It’s a short, simple code, but if you enter it wrong too many times your device automatically wipes itself and then it’s game over, so you can’t brute force it (that is, try every combination until you find the right one).
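To see why that wipe limit is the whole ballgame, consider the maths: a 4-digit passcode has only 10,000 possible values. Here’s a toy sketch (the passcode and unlock check are obviously made up, not Apple’s actual implementation) showing how quickly an unthrottled brute force would walk through the entire space:

```python
import itertools

SECRET_CODE = "4829"  # hypothetical 4-digit passcode for this demo

def try_unlock(guess: str) -> bool:
    """Stand-in for the device's passcode check."""
    return guess == SECRET_CODE

def brute_force() -> tuple[str, int]:
    """Try every 4-digit combination in order until one works."""
    candidates = ("".join(d) for d in itertools.product("0123456789", repeat=4))
    for attempt, guess in enumerate(candidates, start=1):
        if try_unlock(guess):
            return guess, attempt
    raise RuntimeError("no code found")

code, attempts = brute_force()
print(f"Cracked {code} after {attempts} attempts out of at most 10,000")
```

Without the auto-wipe, an attacker finishes this loop in moments; with it, ten wrong guesses and the data is gone. That limit, not the code length, is what the FBI want removed.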
However in this case, getting around the passcode should have been easy. Really easy. Any device owned by or accessing data belonging to any government, business, or educational entity should be secured, managed and maintained using something called a Mobile Device Management (MDM) service. If not, you have no control over them, and that is never OK. Most importantly though, it would have allowed San Bernardino County to, at the FBI’s request, remotely clear the device’s passcode. In seconds. San Bernardino County even have a contract for exactly that technology, but with no countywide policy the particular department had simply chosen not to implement it.
Apple proposed that if they could cause the iPhone to perform an auto-backup to iCloud, all of the remaining data could be recovered that way. Yet for some inexplicable reason, it appears that less than 24 hours after taking possession of the phone the FBI asked San Bernardino County IT to change the password associated with the iCloud account. This meant that the phone could not perform a backup without first being unlocked for the new password to be entered – as we’ve already covered, not an option.
So what are Apple actually being ordered to do?
I think more articles have focused on this than on any other point, so I’ll try to be brief. The FBI want Apple to produce a “one-off” iOS software image file that can be forced onto the iPhone in question and will break one of the most important safety features of the device – the limitation on incorrect lock-code inputs. Now this is very easy to do, but only Apple themselves can do it, since the iPhone will only accept a software update if it is verified by Apple’s digital signature.
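The gatekeeping logic is worth spelling out. Apple actually use asymmetric (public-key) signatures, but this stdlib-only sketch with an HMAC as a stand-in shows the same principle: only the holder of the signing key can produce an image the device will install, so a crippled firmware is impossible without Apple’s cooperation. The key and image contents here are, of course, invented for illustration:

```python
import hashlib
import hmac

# Hypothetical signing key. In reality Apple use asymmetric signatures,
# but HMAC keeps this sketch self-contained while preserving the idea:
# only the key holder can sign.
APPLE_SIGNING_KEY = b"secret-held-only-by-apple"

def sign_update(firmware: bytes) -> bytes:
    """What only Apple can do: produce a valid signature for an image."""
    return hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()

def device_accepts(firmware: bytes, signature: bytes) -> bool:
    """What the iPhone does: refuse any image whose signature fails."""
    expected = hmac.new(APPLE_SIGNING_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"iOS update image"
assert device_accepts(official, sign_update(official))

# An attacker's modified image fails verification, even reusing a
# signature from a genuine release.
tampered = b"iOS update image with passcode limit removed"
assert not device_accepts(tampered, sign_update(official))
```

This is precisely why the court order lands on Apple’s doorstep rather than a forensics contractor’s: nobody else can make the phone say yes.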
And why is that a problem?
First let’s recap. We have a small independent terrorist cell, who we are very confident are all already dead or imprisoned. We have a phone from which we already have most of the data, leaving only a relatively small amount to recover. We know that there is unlikely to be anything useful there, since two other devices, far more likely to have been used in association with the terrorist activities, have been destroyed. We have two missed opportunities to recover that data due to failings within the government and the FBI. So now we’ve made it Apple’s problem.
This backdoor does not currently exist, and as stated above only Apple can create it. There are two main reasons that this should never be allowed to happen:
The first is what happens if it ever gets leaked?
The order does state that it should be coded with a unique identifier for the device, in such a way that it can only be used on the subject device. However, code gets hacked and vulnerabilities get exploited all the time. iPhones would certainly be more vulnerable with a dangerous piece of signed software in existence – one that hackers need only find a way to re-target at a different identifier in order to abuse – than if that software continues to simply not exist.
The second is what happens when the FBI come knocking again?
The FBI claim this is a one-off request due to exceptional circumstances. But it is totally inconceivable that if or when another exceptional circumstance arises, maybe something much worse, they will stop and say “actually we did promise we wouldn’t ask again” – once the precedent has been set of course they will use it again!
This is where people start to pose a very serious question:
if the gov hacks the iPhone themselves, they don’t get the legal precedent they are so desperate to establish in this case.
— Christopher Soghoian (@csoghoian) February 17, 2016
Americans are more afraid of a terrorist attack on their home soil than almost anything else in the world. This would certainly not be the first time that the US Government has been accused of attempting to use this fear to increase their power to snoop.
There is one more major problem with all this. It won’t work.
Last year there was a lot of talk about insisting that governments should be able to demand access to all encrypted transmissions “for our safety”. That was ridiculous, and this feels a lot like a watered-down Plan B. The problem is that encryption simply doesn’t work like that. You can’t stop people encrypting. There’s no way to control it, or to force people to only use algorithms with a backdoor built in. You could even outlaw encryption entirely, but, like many banned substances I can think of, those who want it and don’t mind breaking the law would still be able to use it – and last I checked, terrorists fit that bill.
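To make the point concrete: strong encryption is not some exotic capability that only big vendors can provide. A one-time pad – information-theoretically unbreakable when the key is truly random, as long as the message, and never reused – fits in a dozen lines of standard-library Python. No law aimed at Apple touches code like this:

```python
import os

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: XOR the message with a random key of equal length.

    Unbreakable provided the key is random, kept secret, and never reused.
    """
    key = os.urandom(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """XOR with the same key recovers the original message."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"meet at midnight"
ciphertext, key = otp_encrypt(message)
assert otp_decrypt(ciphertext, key) == message
```

Anyone determined enough to plan an attack is determined enough to copy a snippet like this, which is why mandated backdoors only ever catch the law-abiding.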
The situation this would create would be one where everybody would be a bit less safe, with a bit less privacy. Apart from those with something to hide. They would now know they can’t rely on Apple (or any mainstream software vendor) to keep their data encrypted, so they’d look elsewhere and then carry on exactly as they were.