Posted 24th February 2016 | By ExtraMile

I’m sure you’ll have seen in the news over the past week that a federal judge has ordered Apple to comply with an FBI request to help break the encryption on an iPhone used by one of the perpetrators of the San Bernardino shooting. Apple have refused the order and have posted an official response from CEO Tim Cook.

Ignoring the obvious issues the US government has with cyber communication (see the WikiLeaks scandal), is Apple right to take the stance it has? Why are they refusing to co-operate with the FBI in a case where vital anti-terrorist information could be uncovered with their help? After all, they’re claiming to be protecting their customer base, but if their customers have nothing to hide it shouldn’t matter, should it?

It’s the same argument used for all sorts of changes made in the name of security. It’s a common refrain, for example, when talking about CCTV and its installation across the globe. CCTV is inherently a ‘good thing’, so the argument goes, and thus the only people who should worry about it are those who have something to hide. Tell that to the victim of Ciaran McCleave, a CCTV operator for the Police Service of Northern Ireland (PSNI), who used a camera that was supposed to be monitoring an interface area of Belfast to spy through the window of a first-floor apartment over the course of 26 days, at one point filming the young woman inside topless. She had nothing to hide and believed that the privacy of her own home was just that: private! She was victimised by the very thing that was supposed to keep her safe.

Now, that’s one man using a technology that has obvious advantages in a sectarian flashpoint area for his own inappropriate purposes, whereas the issue in question in the San Bernardino case is a single request for a specific phone. Or is it? Realistically, once the FBI realise they can make this request and compel technology companies to modify their software to allow them access, is it likely to be a one-time thing? Or will they be making the same request to Google, Microsoft, Apple et al. every time there is an encrypted device linked to terrorist activity?

It’s an obvious step to make. The problem is that the more prevalent something becomes, the more likely it is to end up in the hands of someone who is less than trustworthy (or who has had their head turned by the opportunity to make a lot of money). The first CCTV operator probably never imagined using the technology for his own gratification – but CCTV became more prevalent, the number of operators increased, and you end up with cases like the one above.

Cyber crime is big business – from international organised crime groups to state-sponsored hacking, there is money to be made. Once a few of these ‘one-off’ back-door access tools have been created for the FBI, CIA, NSA etc. (and there will be more than this one request), it wouldn’t take much for that software to be stolen or leaked. And if Apple are forced to do it for US government agencies, requests from other countries for access to the tool will quickly follow – it’s one thing for the FBI to ask Apple to trust that the tool will only be used by them, but it sets a precedent that essentially forces Apple to accede to requests for a backdoor into their software in any nation in which they sell their devices.

Once it’s outside the protective bubble of trusted organisations, it becomes a free-for-all – all of the data stored on any Apple device will be easily accessible to anyone with a reason to access it, whether that’s law enforcement or criminals.

Let’s not forget, the reason Apple have built that level of encryption into their devices is to protect their users against cyber crime. It’s no surprise that the vast majority of cyber crime is achieved through social engineering, i.e. working out enough about a particular person to figure out their password and building out from there. It’s now much easier to do that than to try to brute-force an account, precisely because of the level of security the tech firms are putting in place.
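To put a rough number on that, here’s a back-of-envelope sketch in Python of why brute-forcing an iPhone passcode is impractical without Apple’s co-operation. The figures are assumptions for illustration, drawn from Apple’s public iOS Security guide of the period: each passcode guess costs roughly 80 milliseconds of hardware-bound key derivation, and iOS imposes escalating lockout delays after repeated failed attempts. Disabling exactly those delays (and the optional auto-erase after ten failures) is, in essence, what the FBI’s request to Apple amounts to.

```python
# Back-of-envelope sketch: why brute-forcing an iPhone passcode is
# impractical without Apple's help. The numbers below are illustrative
# assumptions taken from Apple's public iOS Security guide (c. 2015),
# not measurements.

ATTEMPT_COST_S = 0.08  # ~80 ms per guess, enforced by hardware-bound key derivation

# Escalating lockout delays (in seconds) after the Nth failed attempt
# (assumed schedule for illustration).
DELAYS = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}
DELAY_AFTER_NINE_S = 60 * 60  # every attempt from the 10th onwards waits an hour

def brute_force_time(keyspace: int) -> float:
    """Worst-case seconds to try every passcode in the keyspace."""
    total = keyspace * ATTEMPT_COST_S
    for attempt in range(1, keyspace + 1):
        if attempt in DELAYS:
            total += DELAYS[attempt]
        elif attempt > 9:
            total += DELAY_AFTER_NINE_S
    return total

# Even a 4-digit PIN (10,000 combinations) takes over a year to exhaust
# once the hourly lockout kicks in -- and with the optional "erase after
# 10 failed attempts" setting enabled, you never get that far anyway.
print(f"4-digit PIN: {brute_force_time(10_000) / 86_400 / 365:.1f} years")
print(f"6-digit PIN: {brute_force_time(1_000_000) / 86_400 / 365:.1f} years")
```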

There is the argument (the one the FBI are using) that there needs to be a compromise between privacy and security, but there are no compromises with security. The minute there is any gap in a security system, it’s no longer secure – it’s like locking all the doors of your house when you go out but leaving the key under the doormat, and then being surprised when your house is burgled.

Ultimately, the question that will be asked is whether the benefits of allowing law enforcement agencies to access any data on any device they wish outweigh handing that same ability to criminals. My concern is that the decision will be made by judges who don’t understand the complexities behind what is being asked, and that the decision will be an emotional one. After all, it’s very hard to argue against a side that is offering answers and resolution to the grieving family and friends of the 14 victims of a tragic shooting.


About ExtraMile

A digital marketing agency with international capabilities