Why Apple should comply with the FBI: Cybersecurity expert

Apple's Tim Cook is resisting a court order to help the FBI unlock an iPhone used by one of the perpetrators of the San Bernardino, California terrorist attack of December 2015 that left 14 dead and 22 wounded. Here's why I believe Apple should comply.


First, there have been proposals (including legislation in New York and California) to prohibit the sale of cellphones with encryption capabilities that prevent this kind of access during criminal investigations. Those proposals essentially legislate the inclusion of a backdoor in these devices, and such requirements would be a bad idea.

Any backdoor decreases the security of the data on the device, not just from the government, but also from 1) criminals that learn how to exploit the backdoor, 2) insiders who have been bribed or otherwise use that access for unauthorized purposes, and 3) foreign governments that would demand the same access to data that has been granted to our government.

This case is not about a backdoor, however. Instead it is about access to data on an existing phone, which, like many recent cellphones, encrypts the data on the phone itself in a way that, as Tim Cook described back in September of 2014, prevents even Apple from retrieving the data.

While the court order provides the legal authority to retrieve the data, the court cannot order Apple to do something that is not technically possible. That is why the demands in the order are different: to provide assistance in bypassing the separate auto-erase function and the delays between passcode guesses. This assistance would enable the government to "brute-force" the login screen in at most about 10,000 attempts, the number of possible four-digit passcodes.
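To make concrete why removing those two protections is all that matters, here is a minimal sketch in Python, not Apple's code, that models a brute-force search over the 10,000 possible four-digit passcodes, once with an assumed ten-failure auto-erase and escalating delays and once without. The passcode, the delay schedule, and the per-guess timing are illustrative assumptions, not measured figures.

```python
# Illustrative model only: why bypassing auto-erase and inter-guess delays
# makes a four-digit passcode a trivial target. Not Apple's implementation.
import itertools

SECRET_PASSCODE = "4951"            # hypothetical passcode on the device
WIPE_AFTER_FAILURES = 10            # assumed auto-erase threshold
ESCALATING_DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]  # seconds, assumed

def brute_force(protections_enabled: bool):
    """Try every four-digit code; return (found, attempts, simulated_seconds)."""
    elapsed = 0.0
    guesses = ("".join(d) for d in itertools.product("0123456789", repeat=4))
    for attempts, guess in enumerate(guesses, start=1):
        if protections_enabled:
            if attempts > WIPE_AFTER_FAILURES:
                return None, attempts - 1, elapsed   # device has erased itself
            elapsed += ESCALATING_DELAYS[attempts - 1]
        else:
            elapsed += 0.08                          # assumed ~80 ms per check
        if guess == SECRET_PASSCODE:
            return guess, attempts, elapsed
    return None, 10_000, elapsed

for enabled in (True, False):
    code, tries, secs = brute_force(protections_enabled=enabled)
    print(f"protections={'on' if enabled else 'off'}: "
          f"found={code!r} after {tries} attempts (~{secs / 3600:.1f} h simulated)")
```

Under these assumptions, the search with protections in place is stopped cold after ten guesses, while the search without them walks the entire keyspace in minutes; that gap is exactly the assistance the order seeks.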


We do not know whether it is possible to bypass these mechanisms. Apple has responded that it is being asked to "create something too dangerous to create. They have asked us to build a backdoor to the iPhone." That response is not technically correct. For it to be a backdoor, Apple would have to add it to the phones it ships, and, as I discussed earlier, I agree that we should not be adding backdoors to our devices.

In this matter, if it is technically possible to defeat the auto-erase and delay mechanisms, then the backdoor already exists in these devices, and Apple is being asked to show the government how to get in; or, perhaps more likely, Apple is being asked to use its technical knowledge of the phone to discover an existing backdoor and then show the government how to use it.


If such a flaw exists, it will inevitably be discovered by the hacker community or by foreign governments down the road. Hiding the flaw does not necessarily improve the security of Apple's customers, but creating the "exploit kit" does expose customers to a greater risk of attack in the short term.

In this particular matter, both the legal and the ethical authority for the search of this device exist. What Apple is being asked to do with respect to this device does not reduce the security of other phones.

While we should resist laws that would mandate the addition of backdoors, the assistance requested here does not constitute the creation of a backdoor, and Apple should provide it. In so doing, perhaps Apple will discover a vulnerability that it can fix in future devices.


Commentary by Prof. B. Clifford Neuman, director, Center for Computer Systems Security, Information Sciences Institute, University of Southern California. Follow him on Twitter @BCNeuman.

For more insight from CNBC contributors, follow @CNBCopinion on Twitter.