
Update: Tesla Engineers Say Crash Due To Brakes, Not Autopilot Feature

Regarding May's fatal crash of a Tesla Model S operating beta-version software for its Autopilot feature, the company's engineering executives told U.S. Senate Commerce Committee staff that the vehicle's brakes were at fault. The New York Times reported:

"... Tesla told members of the Senate Commerce Committee staff on Thursday that the problem involved the car’s automatic braking system, said the staff member, who spoke on condition of anonymity. It was not clear how or why Tesla considers the automatic braking system to be separate from Autopilot, which combines automated steering, adaptive cruise control and other features meant to avoid accidents. Tesla declined to comment... The company told the committee staff that it considered the braking systems as “separate and distinct” from Autopilot, which manages the car’s steering, and can change lanes and adjust travel speed..."

Auto experts say that the Autopilot feature and brakes should work together. So, either the car didn't recognize that it had to stop, or it failed to stop when it should have. The Autopilot feature requires the driver to be ready to assist, if needed. The National Highway Traffic Safety Administration (NHTSA) is investigating the crash.

Consumer Reports, which has tested vehicles for decades, has called for automakers to not use people as "guinea pigs for vehicle safety beta programs."

While the fatal Tesla crash was tragic, it is also a reminder for consumers to:

  • Know the differences between full autonomous automation and features that assist drivers,
  • Know the limitations of automation features including road conditions that require driver intervention,
  • Know which features use beta-version software (which means they are unfinished and still being tested), and
  • Read all applicable policies (e.g., terms of service, privacy) before and after purchasing a vehicle to understand your responsibilities and liability.



Chanson de Roland

Apparently, the brakes worked well enough to come to a stop, and then suddenly failed when the Tesla should have recognized the semi-truck and come to, or remained at, a stop. So this sounds like BS to me: a lawyer's statement crafted to avoid technically lying in testimony to Congress, which is a federal felony, while also avoiding any statement that would prejudice Tesla in litigation or in the NHTSA's investigation of the accident.

It all depends on what the meaning of brakes is.


As a non-lawyer and layperson, I also thought the engineers' statements before the Senate committee sounded like BS, especially given the Autopilot feature description on Tesla's site (which I included in my July 5th blog post):

"Autopilot allows Model S to steer within a lane, change lanes with the simple tap of a turn signal, and manage speed by using active, traffic-aware cruise control. Digital control of motors, brakes, and steering helps avoid collisions from the front and sides, and prevents the car from wandering off the road. Autopilot also enables your car to scan for a parking space and parallel park on command..."

A car can't parallel park without using its brakes. If Autopilot doesn't use the car's brakes, then Tesla should have named the feature Autosteer.


