Tuesday's blog post discussed the announcement by the U.S. Department of Justice (DOJ) that it had withdrawn its lawsuit against Apple, Inc. because the Federal Bureau of Investigation (FBI), with the help of an unnamed third party, had successfully unlocked the San Bernardino attacker's iPhone and accessed the information in the device. That blog post also discussed several related issues and implications. The government did not disclose the exact method it used to unlock the iPhone.
Today's blog post explores another related issue: whether the government will inform Apple of the vulnerability. With information about the vulnerability, Apple can improve the security of its iPhones. That will help all iPhone users better protect their privacy. The Washington Post reported:
"The FBI plans to classify this access method and to use it to break into other phones in other criminal investigations."
The article described how security research usually works. When security engineers find a vulnerability, they inform the developer so a fix can be quickly built and distributed to users. Other developers benefit from the research as well:
"Vulnerabilities are found, fixed, then published. The entire security community is able to learn from the research, and — more important — everyone is more secure as a result of the work. The FBI is doing the exact opposite... All of our iPhones remain vulnerable to this exploit."
No doubt, the FBI and other U.S. government law enforcement (and spy) agencies will use the vulnerability to unlock more iPhones. People forget that iPhones are used by:
"... elected officials and federal workers and the phones used by people who protect our nation’s critical infrastructure and carry out other law enforcement duties, including lots of FBI agents... The problem with computer vulnerabilities is that they’re general. There’s no such thing as a vulnerability that affects only one device. If it affects one copy of an application, operating system or piece of hardware, then it affects all identical copies..."
The worst-case scenario: by withholding vulnerability information, the government fosters a situation where Apple products are less secure than competing brands developed abroad, whose governments freely share vulnerability information. That could hurt the tech company's revenues and profitability... meaning lost jobs here.
There is one tiny bit of good news in this mess (bold added):
"The FBI did the right thing by using an existing vulnerability rather than forcing Apple to create a new one, but it should be disclosed to Apple and patched immediately."
So now, the bad guys (criminals, hackers, other governments' spy agencies) know for sure that a vulnerability exists in newer iPhones. If they look hard enough and long enough, they can find it, too. (Many of the bad guys hire skilled, experienced engineers, too.) Once found, they too can use the vulnerability to hack iPhones.
The government's decision to classify the vulnerability seems myopic at best, and at worst extremely unfriendly to users and businesses. This weakens our defenses. It does not make our defenses stronger.
The government's approach seems to be that surveillance trumps privacy. You could say: surveillance by any means necessary (sorry, Malcolm) and damn the consequences. Damn the collateral damage.
Is this wise? Ethical? Is this how you want your government to operate? Was there a debate about this? Did you provide any input to your elected officials? Have they listened?