In the press lately, the FBI has been pressuring Apple to build it an iOS back door, with the request made in the name of stopping terrorism. But as the Wall Street Journal reports, more phones and cases are at stake. If allowed, this would set a major precedent with huge consequences, to say nothing of the law of unintended consequences. Today's iOS devices obviously store emails, texts, photos, and the like; on the surface, the FBI says its main reason for requesting special access to the device is to establish a motive from this data. But a device also holds password vaults, biometric data (via Touch ID), health data in the Health app and in apps like Quest Diagnostics' Gazelle (for lab results) and its peers, and enormous amounts of financial data across finance-related apps and Apple Pay. With the App Store, the sky's the limit on the amount of personal data, depending on what the user has downloaded. Apple has worked very hard to ensure all this data stays both secure and convenient as more functionality gets rolled into the device. It's a very fine mix of complex security measures and ease of use. According to Bloomberg, US mobile payments are expected to exceed $110 billion by 2018. That's an enormous amount of secure transactions through mobile devices.
The American Bar Association also says, "Healthcare organizations and their business associates should remember that securing mobile devices that access or maintain electronic Personal Health Information (PHI) is not just good practice, but it is the law." The civil penalty is $50,000 per violation, and there has already been major litigation without any "back door" into the devices. Fingerprint biometric data certainly falls under PHI. A Bitglass study found that 68 percent of security breaches were due to the loss or theft of mobile devices or files. If Apple is forced to create a back door, increasing the chance of vulnerabilities, where does the liability lie if it falls into hands other than law enforcement's and is used for fraudulent or criminal purposes?
While we are not experts in civil liberties or HIPAA litigation, cyber security is all about having the strongest possible encryption and data protection. China, North Korea, and other nations have recently been accused of state-sponsored cyber attacks; opening the device because of one suspect seems like an overreach when traditional detective work, wiretaps, and other established techniques are still effective means of investigation. As more and more data moves to portable devices, security should be increased, not decreased, in the name of investigating terrorist motives. The precedent is huge, and Apple needs to hold its ground. It will be interesting to see how this plays out, most likely in the courts, as the counterparty, liability, and data-breach risks are considerable, to say nothing of the financial implications.
This quote sums up Apple's position and dilemma so well: "If a court can ask us to write this piece of software, think about what else they could ask us to write. I don't know where this stops." – Tim Cook