
Wednesday, April 20, 2016



Chanson de Roland

I take issue with the interpretation of CCOA as prohibiting covered entities from making strong, or effective, encryption, with the caveat that I haven't studied either CCOA or the relevant legal authorities enough to do more than offer an off-the-cuff opinion. CCOA requires only that a covered entity either provide the government with information or data in an intelligible form or provide any necessary technical assistance to render the information or data intelligible. But the necessary exception is that no covered entity can be required to provide either information or data in an intelligible form, or technical assistance to render the information or data intelligible, when no such intelligible information or data, or technical means of rendering it intelligible, exists.

That is, Apple or any covered entity could arguably make unbreakable encryption, in which case it could not provide information or data in an intelligible form or provide the necessary technical assistance to render it intelligible, because none exists. Congress really can't easily close this exception without actually forbidding covered entities from making unbreakable encryption, which would then, in effect, mandate that a covered entity always provide a backdoor into its computing devices. Only then would effective encryption be designed only by outlaws, unless a court of competent jurisdiction were to hold that CCOA, as presently drafted, does, as a matter of law, prohibit a covered entity from making and implementing effective encryption on its computing devices, which is something I doubt a court would do for a law that imposes criminal sanctions. But as CCOA's express language now reads, it does not prohibit a covered entity from making effective encryption, so the exception, supra, stands until a court holds otherwise.

But our government's efforts to mandate backdoors into covered entities' computing devices do represent a perverse sort of progress. It means that covered entities, led by Apple, Inc. (Apple), are finally standing up to challenge our government's insistence on unlocking and decrypting their devices (Unlocking), when and where that Unlocking is at least legally dubious, if not illegal; where that Unlocking imposes serious risks to the security of all of our computing devices; and/or where that Unlocking would impose such a burden on a covered entity or that entity's industry as to impair its competitiveness or otherwise impair its business.

This challenging of the government in Congress and the courts is a new development, because, for a very long time, our intelligence agencies, as has been reported, led by the NSA (National Security Agency), have been engaged in an organized effort to build backdoors into the entire information infrastructure of computing devices and networks, both wired and wireless, using every available means, including technical measures, legal process, deceit, fraud, threats, court orders, and/or recruiting the compliant. And they have been very successful at it, so that backdoors now exist throughout our information infrastructure. And, of course, our intelligence agencies, once again led by the NSA, have also relied on in-house and third-party experts to devise means of Unlocking computing devices and of discovering or creating backdoors into networks. So the recent challenges to the government's insistence on backdoors signal that covered entities' supine or prostituted compliance is over and/or that their ability and intent to build effective encryption into their devices has progressed to the point where our government, i.e., the NSA, is frightened that it won't be able to Unlock those devices.

Congress is now seized of the issue and all of its grave implications for security, privacy, and the constitutional rights of U.S. citizens, particularly their First and Fourth Amendment rights, and of the great negative commercial impact that outlawing effective encryption would have on covered entities subject to the jurisdiction of U.S. courts, particularly U.S. tech firms, weighed against whatever value outlawing effective encryption would have for our security and other intelligence gathering.


Perhaps I should have mentioned that Riana Pfefferkorn, the author of the Just Security article, is the Cryptography Fellow at the Center for Internet and Society (CIS) at Stanford Law School. You can read her credentials here:

And, the CIS site saw fit to reprint her article here:


Chanson de Roland

After reviewing CCOA and Ms. Pfefferkorn's article, my argument still holds. The version of iOS that Apple would distribute would be subject to Apple's being able to breach its security, as CCOA requires, unless the owner/user exercised his option, as provided by Apple, to block updates to his Apple device's operating system. CCOA does not prohibit Apple or anyone else from distributing a hackable version of its operating systems, which the user, not Apple, can make un-hackable, i.e., unbreakable, by blocking all updates. I can do that now on all of my Apple devices; that is, I can block all updates. If the user exercised that option, which already exists on Apple's devices, Apple would simply be incapable of satisfying paragraph (a) of CCOA, even though it distributed a hackable device, because it could not push/install an update, GovtOS, to that device that would remove the device's security features. To eliminate that exception in a clear and unambiguous way, CCOA would have to expressly command that covered persons at least not provide a user with the option to block updates to his device's operating system, which it does not do.

Yet, if CCOA were to mandate that no covered person could provide an owner/user with the option to block updates to his device's operating system, that would immediately lead to absurd, dangerous, and impractical results. First, such a device would be immediately useless for business and for much personal computing. In business, the IT department must be able to control whether, how, when, and which updates install on the firm's computing devices, because it must be able to test the validity, authenticity, compatibility, and security of any update before permitting it to be installed on what could be thousands of computing devices. And individuals, even though they are much more lax in policing the security and function of their computing devices, have similar needs and face similar problems of privacy, security, compatibility, etc.

Losing control of whether, how, when, and which updates are installed on one's computing devices is also immensely dangerous. If an update is compromised by malware and the owner can't block its installation, the result will be an utter loss of privacy, security, and/or functionality, with all the mischief that implies.

And then, of course, there is the absurdity of having a device whose operating system others can modify without your having any say in the matter.

And, as others have noted, the foregoing inability to control how one's computing devices are updated would be exploited not only by our intelligence and law enforcement agencies but by every technically sophisticated foreign intelligence agency and by criminal hackers. Not that Vladimir Putin would find the information on my computing devices that interesting; I am sure he found the information on Hillary Clinton's personal email server much more fascinating reading. However, criminals might find hacking me profitable.

Who would want to buy, much less use, such a device: a device whose software updates you can't control, a device that is potentially at any moment a hostile agent? Certainly, the foreign customers of U.S. tech firms will take their custom elsewhere. And, as the Editor notes, such a mandate would be a vain thing against competent criminals, terrorists, and foreign agents, because they would root their devices so as to block updates, so that even a covered entity couldn't hack them. Then the only people subject to having their computing devices hacked under CCOA's authority would be ordinary people and lazy, stupid, and/or ignorant criminals and terrorists.

But, as drafted, CCOA does permit, I think, a covered entity to distribute devices that are hackable by an update to the operating system, as CCOA requires, but that give the user the option of blocking all updates to the device (which, once again, already exists for Apple's devices), thus making that device un-hackable, i.e., not unlockable, even by the covered entity.

Will Congress go so far as to take away the owner's option to block updates, thereby removing owners' ability to control updates to their own computing devices? I think that Congress's doing so would violate the constitutionally protected property and/or privacy rights of those owners. But, more importantly, it would signal that Congress has transformed the United States into a surveillance state, into Big Brother. That Big Brother might be benign or even beneficent at first, but the ability to breach everyone's privacy must inevitably break all bounds of law and morality, because constitutional and democratic government is incompatible with the power to breach everyone's privacy, which must inexorably lead to wickedness. And then there are the criminals, terrorists, and despotic foreign powers, who are bound only by their desires and needs. Is this what Congress wants?







  • © 2007 - 2017. George Jenkins. All Rights Reserved.

