NHTSA Investigates Fatal Crash Of Tesla Auto. Numerous Implications For Drivers
Tuesday, July 05, 2016
Several news sites reported the fatal crash of a Tesla Motors Model S while it was operated in Autopilot mode. Tesla Motors released a statement about the incident:
"... NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash that occurred in a Model S. This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles... What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied..."
Established in 1970, the National Highway Traffic Safety Administration (NHTSA) is responsible for setting and enforcing safety standards on the nation's highways. Tesla's statement also described its Autopilot feature:
"... Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times," and that "you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again."
The Tesla site provides a general description of the Autopilot feature:
"Autopilot allows Model S to steer within a lane, change lanes with the simple tap of a turn signal, and manage speed by using active, traffic-aware cruise control. Digital control of motors, brakes, and steering helps avoid collisions from the front and sides, and prevents the car from wandering off the road. Autopilot also enables your car to scan for a parking space and parallel park on command. And our new Summon feature lets you "call" your car from your phone so it can come greet you at the front door in the morning. Autopilot features are progressively enabled over time with software updates."
This fatal crash has broad implications. The New York Times reported:
"The crash also casts doubt on whether autonomous vehicles in general can consistently make split-second, life-or-death driving decisions on the highway. And other companies are increasing investments in self-driving technology. Google, for example, recently announced plans to adapt 100 Chrysler minivans for autonomous driving. Earlier this year, G.M. acquired the software firm Cruise Automation to accelerate its own self-driving applications. Even as the companies conduct many tests on autonomous vehicles at both private facilities and on public highways, there is skepticism that the technology has progressed far enough for the government to approve cars that totally drive themselves."
In 2013, NHTSA defined five levels of automation in vehicles:
"No-Automation (Level 0): The driver is in complete and sole control of the primary vehicle controls – brake, steering, throttle, and motive power – at all times.
Function-specific Automation (Level 1): Automation at this level involves one or more specific control functions. Examples include electronic stability control or pre-charged brakes, where the vehicle automatically assists with braking to enable the driver to regain control of the vehicle or stop faster than possible by acting alone.
Combined Function Automation (Level 2): This level involves automation of at least two primary control functions designed to work in unison to relieve the driver of control of those functions. An example of combined functions enabling a Level 2 system is adaptive cruise control in combination with lane centering.
Limited Self-Driving Automation (Level 3): Vehicles at this level of automation enable the driver to cede full control of all safety-critical functions under certain traffic or environmental conditions and in those conditions to rely heavily on the vehicle to monitor for changes in those conditions requiring transition back to driver control. The driver is expected to be available for occasional control, but with sufficiently comfortable transition time. The Google car is an example of limited self-driving automation.
Full Self-Driving Automation (Level 4): The vehicle is designed to perform all safety-critical driving functions and monitor roadway conditions for an entire trip. Such a design anticipates that the driver will provide destination or navigation input, but is not expected to be available for control at any time during the trip. This includes both occupied and unoccupied vehicles."
Today's vehicles offer several safety automation features to assist drivers: Automatic Crash Notification (ACN), Automatic Emergency Braking (AEB), Electronic Stability Control (ESC), Forward Collision Warning (FCW), Lane Departure Warning (LDW), Lane Keeping Support, and Pedestrian Crash Avoidance/Mitigation. There are huge differences between autonomous automation and assisted-driving features.
There are big differences between Tesla cars and Google's self-driving car. Earlier this year, NHTSA granted "driver" status to the software in Google's driverless cars. According to the Washington Post:
"... the law will treat the car's software as the driver. "We agree with Google its [self-driving vehicle] will not have a 'driver' in the traditional sense that vehicles have had drivers during the last more than one hundred years," the letter reads: "If no human occupant of the vehicle can actually drive the vehicle, it is more reasonable to identify the "driver" as whatever (as opposed to whoever) is doing the driving." The decision by NHTSA marks a huge moment for Google and the rest of the auto industry as it races to build the first fully autonomous motor vehicle. While most other carmakers are building their vehicles with steering wheels, brake pedals and other machinery in mind, Google imagines that its robot car will have none of these things."
The fatal Tesla accident is truly tragic. It is also a reminder for consumers to:
- Know the differences between full autonomous automation and features that assist drivers,
- Know the limitations of automation features including road conditions that require driver intervention,
- Know which features are beta versions (which means they are unfinished and still being tested), and
- Read all applicable policies (e.g., terms of service, privacy) before and after purchasing a vehicle to understand your responsibilities and liability. Certain features and road conditions require driver intervention.
The features in automated vehicles depend upon software, and a beta version is software still being tested. Wise Geek provides a definition:
"The beta version of a software release is considered to be a preview; though it may include many standard features, it is not yet ready for wide release or sale. During this phase, the developers collect feedback from users about the product's functionality, including what they like and what should be changed before its wide release. A beta version of a program can be either "closed," which is limited to a specific group of users, or "open," which is available to the general public. During this testing, developers might release numerous versions of a program, including improvements and bug fixes with each iteration."
So, the software may have bugs or errors that affect a feature's performance and/or its interaction with other features. And government regulators seem satisfied with this. Reuters reported:
"Hours before the crash became public knowledge on Thursday, U.S. National Transportation Safety Board Chairman Christopher Hart said driverless cars will not be perfect. "There will be fatal crashes, that's for sure," Hart told the audience at the National Press Club in Washington, but added that will not derail the move toward driverless cars, even if the vehicles are not ready... Former NHTSA chief Joan Claybrook said in an interview the agency needs to set performance standards for electronic systems like Autopilot. "It's like the Wild West. The regulatory system is not being used," Claybrook said."
It seems wise for consumers to know before purchase: a) the specific limitations of features (and associated sensors) using beta version software; b) when software testing will be completed and a final version available; c) if price discounts are available for features being tested; and d) if the limitations require more driver attention or driver intervention during specific road and/or weather conditions.
Also, a 2014 survey found that half of Americans don't know what a privacy policy is. It is difficult to find statistics about the percentage of users that read terms of service policies (a/k/a terms and conditions). The best estimate I've found is from 2008: 10 percent of consumers read terms of service policies. Even if that percentage has since doubled, it's still abysmal.
Should drivers place a lot of trust in features using beta version software? Do you view current regulatory activity as acceptable? Comments?
Another viewpoint suggests the Tesla software, which mistook the white side of a tractor-trailer truck for the sky, might also have value as a weapon to trick semi-autonomous robots used during war:
"For all the talk of a 'robot-readable world,' in other words, it is interesting to consider a world made deliberately illegible to robots, with materials used for throwing off 3D cameras or LiDAR, either through excess reflectivity or unexpected light-absorption."
Robot War And The Future of Perceptual Deception
http://www.bldgblog.com/2016/07/robot-war-and-the-future-of-perceptual-deception/
George
Editor
http://ivebeenmugged.typepad.com
Posted by: George | Wednesday, July 06, 2016 at 01:33 PM
From Consumer Reports:
"Consumer Reports experts believe that these two messages—your vehicle can drive itself, but you may need to take over the controls at a moment’s notice—create potential for driver confusion. It also increases the possibility that drivers using Autopilot may not be engaged enough to react quickly to emergency situations. Many automakers are introducing this type of semi-autonomous technology into their vehicles at a rapid pace, but Tesla has been uniquely aggressive in its deployment. It is the only manufacturer that allows drivers to take their hands off the wheel for significant periods of time, and the fatal crash has brought the potential risks into sharp relief."
And:
"By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security," says Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports. "In the long run, advanced active safety technologies in vehicles could make our roads safer. But today, we're deeply concerned that consumers are being sold a pile of promises about unproven technology. 'Autopilot' can't actually drive the car, yet it allows consumers to have their hands off the steering wheel for minutes at a time. Tesla should disable automatic steering in its cars until it updates the program to verify that the driver's hands are on the wheel... “Consumers should never be guinea pigs for vehicle safety 'beta' programs...”
http://www.consumerreports.org/tesla/tesla-autopilot-too-much-autonomy-too-soon/
George
Editor
http://ivebeenmugged.typepad.com
Posted by: George | Sunday, July 17, 2016 at 08:38 PM