Will Criminals and Oppressive Regimes Get a Boon From the Technology the Government Wants to Force Apple to Develop?

I imagine most everyone is at least peripherally aware of the debate raging between the U.S. Justice Department and Apple. Anyone I’ve spoken with has the basic understanding that the FBI would simply like Apple to hack an iPhone 5c belonging to a now-deceased terrorist involved in the recent mass shooting in San Bernardino that took 14 innocent lives and injured many others.

Would that it were that simple. As with most issues concerning technology, privacy, and law enforcement, there are many layers to be considered.

A key aspect of this quandary is the government’s claimed right to force a company with no specific knowledge of a crime, and itself under no criminal investigation, to “engineer” a means (actually a tool) to defeat the data security measures it built into its devices to protect its customers’ private information.

Apple is extremely serious about offering bullet-proof data security for iPhone customers. So much so that they have even denied themselves potential access via some “back door” that could have been hidden within the new iOS 9 operating system. In addition, they engineered a deliberate delay on every passcode guess, to make it far more time-consuming and costly for hackers to use supercomputing power to guess PIN codes, and an “auto-erase” feature that allows only a certain number of tries before erasing all data from the phone. Pretty nifty, huh? Now the FBI wants them to essentially redesign the system without the delay and “auto-erase” features and install it on just this one particular phone.
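To make the mechanics concrete, here is a minimal, hypothetical sketch (in Swift, purely for illustration) of the two protections described above: a deliberate delay on every passcode guess and an auto-erase after too many failures. The type name, the delay value, and the ten-attempt limit are my own illustrative assumptions, not Apple’s actual implementation, which enforces this in hardware and in iOS itself.

```swift
import Foundation

// Hypothetical sketch of the protections described above. The names, the
// per-guess delay, and the ten-attempt limit are illustrative assumptions,
// not Apple's actual implementation.
struct PasscodeLock {
    let correctPasscode: String
    let maxFailures: Int = 10               // failures allowed before "auto-erase"
    let delayPerGuess: TimeInterval = 0.08  // small pause imposed on every attempt

    var failedAttempts: Int = 0
    var dataErased: Bool = false

    mutating func tryPasscode(_ guess: String) -> Bool {
        // Once the data has been "erased," no guess can ever succeed again.
        guard !dataErased else { return false }

        // The delay is imperceptible to a human typing a passcode, but it caps
        // automated guessing at roughly a dozen attempts per second instead of
        // millions.
        Thread.sleep(forTimeInterval: delayPerGuess)

        if guess == correctPasscode {
            failedAttempts = 0
            return true
        }

        failedAttempts += 1
        if failedAttempts >= maxFailures {
            dataErased = true               // the "auto-erase": the data is gone
        }
        return false
    }
}

// A brute-force attacker gets at most ten tries before everything is wiped.
var lock = PasscodeLock(correctPasscode: "4821")
_ = lock.tryPasscode("0000")   // wrong guess: one of only ten chances used up
```

What the FBI is asking for, in effect, is a version of the system with the delay and the erase check stripped out, so that passcodes could be guessed by machine at full speed without any risk of losing the data.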

Here’s what Apple CEO Tim Cook had to say about that: “The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”

Can you say “Pandora’s Box”? That’s what I think Mr. Cook is suggesting the government would like Apple to open. If they are forced to develop this “tool,” evidence gleaned with it could potentially be used in a criminal prosecution. The defense attorney would be duty-bound to call it into question, and ironically, Apple would be put in the position of having to defend a forensic tool they didn’t want to create in the first place. How would they defend it? Not without some serious third-party involvement and an extreme potential for the tool to be leaked. To support the validity of evidence uncovered with the tool, it would have to be peer-reviewed, tested on numerous other devices, given to forensics experts and defense experts, and explained on the witness stand. In the likely event of an appeal, Apple would get to go through much of that process all over again. They might even have to defend themselves in lawsuits brought by anyone convicted on evidence garnered with the tool!

If the government gets its way, various agencies and private forensics companies may get a rare opportunity to ingest and reverse-engineer a “magic key.” A private forensics company could, in theory, combine it with other tools and sell it as a commercial product. It might also leak to criminal hackers, who could reverse-engineer it to learn exciting new ways to steal private data or find “injection points” from which to further undermine data security.

What about the international ramifications? If the U.S. government can compel Apple to furnish a code-breaking tool that can defeat Apple’s own most secure operating system to date, what happens when the Russian or Chinese government wants one too? What if an oppressive regime makes handing over such a tool a condition of selling devices in that country? It’s not hard to imagine the civil and human rights violations running rampant. We can foretell with almost absolute certainty that many journalists, writers, activists, ordinary citizens, and opposition leaders would end up with decades-long prison sentences or in front of a firing squad. Heck, a corrupt enough regime could potentially use the tool to manufacture evidence to eliminate any group or individual it just happens to deem an “enemy of the state.”

It seems like a lot to ask of a company that’s just trying to do right by its customers by keeping our personal data as secure as possible.

Published by Bill Hoover