Why iPhones Are Now Less Secure, And How This Affects Everyone
Thursday, March 31, 2016
Tuesday's blog post discussed the announcement by the U.S. Department of Justice (DOJ) that it had withdrawn its lawsuit against Apple, Inc. because the Federal Bureau of Investigation (FBI), with the help of an unnamed third party, had successfully unlocked the San Bernardino attacker's iPhone and accessed the information in the device. That blog post also discussed several related issues and implications. The government did not disclose the exact method it used to unlock the iPhone.
Today's blog post explores another related issue: whether the government will inform Apple of the vulnerability. With information about the vulnerability, Apple can improve the security of its iPhones. That will help all iPhone users better protect their privacy. The Washington Post reported:
"The FBI plans to classify this access method and to use it to break into other phones in other criminal investigations."
The article described how security research usually works. When security engineers find a vulnerability, they inform the developer so a fix can be quickly built and distributed to users. Also, other developers learn:
"Vulnerabilities are found, fixed, then published. The entire security community is able to learn from the research, and — more important — everyone is more secure as a result of the work. The FBI is doing the exact opposite... All of our iPhones remain vulnerable to this exploit."
No doubt, the FBI and other U.S. government law enforcement (and spy) agencies will use the vulnerability to unlock more iPhones. People forget that iPhones are used by:
"... elected officials and federal workers and the phones used by people who protect our nation’s critical infrastructure and carry out other law enforcement duties, including lots of FBI agents... The problem with computer vulnerabilities is that they’re general. There’s no such thing as a vulnerability that affects only one device. If it affects one copy of an application, operating system or piece of hardware, then it affects all identical copies..."
The worst-case scenario: by withholding vulnerability information, the government fosters a situation where Apple products are less secure than competing brands developed abroad, whose governments freely share vulnerability information. That could negatively affect the tech company's revenues and profitability... meaning lost jobs here.
There is one tiny bit of good news in this mess (bold added):
"The FBI did the right thing by using an existing vulnerability rather than forcing Apple to create a new one, but it should be disclosed to Apple and patched immediately."
So now, the bad guys (criminals, hackers, other governments' spy agencies) know for sure that a vulnerability exists in newer iPhones. If they look hard enough and long enough, they can find it, too. (Many of the bad guys hire skilled, experienced engineers, too.) Once found, they too can use the vulnerability to hack iPhones.
The government's decision to classify the vulnerability seems myopic at best, and at worst extremely unfriendly to users and businesses. This weakens our defenses rather than strengthening them.
The government's approach seems to be that surveillance trumps privacy. You could say: surveillance by any means necessary (sorry, Malcolm) and damn the consequences. Damn the collateral damage.
Is this wise? Ethical? Is this how you want your government to operate? Was there a debate about this? Did you provide any input to your elected officials? Have they listened?
Our government's law enforcement, intelligence, and military services, like all technically sophisticated government law enforcement, military, and intelligence services, maintain an arsenal of zero-day exploits for smartphones and all other types of computing devices. They all do that so that they can gather intelligence, conduct cyber warfare, and investigate crime. In fact, there is an industry of third-party contractors who are paid to discover exploits and provide them to governments' law enforcement, military, and intelligence agencies; those agencies, principally the N.S.A., also have their own engineers to discover such exploits, usually for gathering intelligence and providing cyberwar capability, though such exploits are only rarely used for law enforcement. This is true for the U.K., Israeli, U.S., Russian, Chinese et al. intelligence, military, and law enforcement services. And for most of these services, such zero-day exploits are never disclosed to manufacturers or publicly disclosed in any way, except where the risk of harm to the nation from not disclosing the exploit is deemed to be greater than the benefit for law enforcement, military uses, and/or intelligence gathering.
There is no law that requires the FBI or any other U.S. law enforcement or intelligence gathering service or the military to disclose its arsenal of zero-day exploits. Nor should there be, for that would result in the unilateral disarmament of the U.S. military, intelligence, and law enforcement services in the face of adversaries, hostile forces, and people of nefarious intent, such as terrorists.
What the U.S. does instead of disclosing all of its zero-day exploits is to decide which to disclose and which to keep secret. The committee that does that was set up by the President's order: the Vulnerabilities Equities Process. The Vulnerabilities Equities Process (Equities Process) is a senior inter-agency review process that uses a number of factors to decide, under the circumstances, whether to disclose an exploit. However, what those factors are and how the Equities Process makes its decisions is a secret, though we are told that the process is biased toward disclosing exploits to provide greater security and defense for commercial and government computing systems. See http://www.wired.com/2014/04/obama-zero-day/ and http://www.nytimes.com/2014/04/13/us/politics/obama-lets-nsa-exploit-some-internet-flaws-officials-say.html?_r=1. The President has decided that the bias toward disclosure and defense best serves the national interests, but, if that bias is deemed to be a net harm to the national interest for a particular exploit, under the prevailing circumstances, the Equities Process will decide not to disclose the exploit. And, of course, particularly close calls are left to the President.
Yet without knowing how the Equities Process reaches its decisions, vaguely stating that there is a bias toward disclosure and defense might effectively result in de facto non-disclosure for the purposes of intelligence gathering and enhancing cyberwar capabilities, or even for law enforcement against ordinary crimes that don't risk national security. But, of course, the Equities Process is a top-secret or even above top-secret process, so it can't and shouldn't be disclosed to the public, because to do so would result in grievous harm to U.S. interests. So we are left with hoping that we elect the right President, who will make the right decisions.
And there is the fact that not all systems are treated the same. Admiral Rogers, in his confirmation hearing to head the N.S.A. and Cyber Command, said as much: There will be times when an exploit is deemed too important to disclose, so that only certain critical military, governmental, and infrastructure systems will be secured, while makers of commercial systems, such as smartphones and personal computers, will not be informed of the exploit, much less advised on how to patch it. See the Wired report linked to supra. What this means for Apple and its ilk is that they are on their own in securing against at least certain exploits, and that they, therefore, must retain their own engineering staffs and contractors to discover and patch exploits, for they cannot reliably depend on the N.S.A. to disclose all of its exploits to them.
And do Apple and its ilk have any remedy in court to force the government to disclose the exploits that it knows of? The short answer is no. There is no law that requires the government to disclose known exploits, nor is Congress likely to pass any such law, nor should it, for the reasons that I stated, supra. And the Equities Process would easily qualify as a state secret, so a federal court would quickly dismiss any action on either of two unassailable grounds: First, the cause of action to force disclosure is a complaint for which the law does not grant relief, and, second, even if it were, the state secrets doctrine would require dismissal.
So Apple and its ilk must discover and patch their own exploits.
Posted by: Chanson de Roland | Thursday, March 31, 2016 at 02:20 PM