Earlier this week, The Atlantic website published an interview with Francine Berman, a computer-science professor at Rensselaer Polytechnic Institute, about the need for a code of ethics for connected, autonomous devices, commonly referred to as the internet-of-things (IoT). The IoT is exploding.
Experts forecast 8.4 billion connected devices in use worldwide in 2017, up 31 percent from 2016, with the installed base reaching 20.4 billion devices by 2020. Total spending on those devices will approach $2 trillion in 2017. North America, Western Europe, and China, which already account for 67 percent of the installed base, will drive much of this growth.
In a February 2017 article (Adobe PDF) in Communications of the ACM, the journal of the Association for Computing Machinery, Berman and Vint Cerf, one of the internet's original architects, discussed the need for a code of ethics:
"Last October, millions of interconnected devices infected with malware mounted a "denial-of-service" cyberattack on Dyn, a company that operates part of the Internet’s directory service. Such attacks require us to up our technical game in Internet security and safety. They also expose the need to frame and enforce social and ethical behavior, privacy, and appropriate use in Internet environments... At present, policy and laws about online privacy and rights to information are challenging to interpret and difficult to enforce. As IoT technologies become more pervasive, personal information will become more valuable to a diverse set of actors that include organizations, individuals, and autonomous systems with the capacity to make decisions about you."
Given this, it seems wise for voters to consider whether elected officials in local, state, and federal government understand the issues. Do they understand the issues? If they understand the issues, are they taking appropriate action? If they aren't taking appropriate action, is it due to other priorities? Or are different elected officials needed? At the federal level, recent events with broadband privacy indicate a conscious decision to ignore consumers' needs in favor of business.
In their ACM article, Berman and Cerf posed three relevant questions:
- "What are your rights to privacy in the internet-of-things?
- Who is accountable for decisions made by autonomous systems?
- How do we promote the ethical use of IoT technologies?"
Researchers and technologists have already raised concerns about the ethical dilemmas of self-driving cars. Recent events have also highlighted the issues.
Some background: last October, a denial-of-service attack against a hosting service based in France used a network of more than 152,000 IoT devices, including closed-circuit-television (CCTV) cameras and DVRs. The fatal crash last May of a Tesla Model S operating in Autopilot mode and the February crash of a Google self-driving car raised further concerns. According to researchers, 75 percent of all cars shipped globally will have internet connectivity by 2020. Last month, a security expert explained the difficulty of protecting connected cars from hackers.
And after a customer posted a negative review online, a developer of connected garage-door openers disabled both the customer's device and online account. (Service was later restored.) Earlier this year, a smart TV maker paid $2.2 million to settle privacy abuse charges by the U.S. Federal Trade Commission (FTC). Consumers buy and use a wide variety of connected devices: laptops, tablets, smartphones, personal assistants, printers, lighting and temperature controls, televisions, home security systems, fitness bands, smart watches, toys, smart wine bottles, and home appliances (e.g., refrigerators, hot water heaters, coffee makers, crock pots, etc.). Devices with poor security don't allow operating-system and security-software updates, don't encrypt key information such as PINs and passwords, or embed their software in firmware that cannot be upgraded. In January, the FTC filed a lawsuit against a modem/router maker alleging poor security in its products.
Consumers have even less control over some IoT devices, such as smart utility meters, which collect information about them. Typically, utility companies own and maintain these devices even though they are installed in or on consumers' premises.
Now, back to the interview in The Atlantic. Professor Berman reminded us that society has met the ethical challenge before:
"Think about the Industrial Revolution: The technologies were very compelling—but perhaps the most compelling part were the social differences it created. During the Industrial Revolution, you saw a move to the cities, you saw the first child-labor laws, you saw manufacturing really come to the fore. Things were available that had not been very available before..."
Well, another revolution is upon us. This time, it includes changes brought about by the internet and the IoT. Berman explained that today's challenges include considerations:
"... we never even imagined we’d have to think about. A great example: What if self-driving cars have to make bad choices? How do they do that? Where are the ethics? And then who is accountable for the choices that are made by autonomous systems? This needs to be more of a priority, and we need to be thinking about it more broadly. We need to start designing the systems that are going to be able to support social regulation, social policy, and social practice, to bring out the best of the Internet of Things... Think about designing a car. I want to design it so it’s safe, and so that the opportunity to hack my car is minimized. If I design Internet of Things systems that are effective, provide me a lot of opportunities, and are adaptive, but I only worry about really important things like security and privacy and safety afterwards, it’s much less effective than designing them with those things in mind. We can lessen the number of unintended consequences if we start thinking from the design stage and the innovation stage how we’re going to use these technologies. Then, we put into place the corresponding social framework."
Perhaps most importantly:
"There’s a shared responsibility between innovators, companies, the government, and the individual, to try and create and utilize a framework that assigns responsibility and accountability based on what promotes the public good."
Will we meet the challenge of this revolution? Will innovators, companies, government, and individuals share responsibility? Will we work for the public good or solely for business growth and profitability?
What do you think?