Federal investigators with the National Transportation Safety Board are reportedly set to find that Tesla, Inc.’s ($TSLA) self-driving system should share the blame in a fatal 2016 crash in which a Model S sedan collided with the side of a truck, killing former Navy SEAL Joshua Brown.
The NTSB meets Tuesday to submit its findings, which will be subject to revision by board members. A preliminary report, however, concluded that Tesla’s automated system allowed Brown to effectively let the car drive itself for “long periods without steering or apparently even looking at the road,” despite warnings from Tesla that customers should maintain awareness of driving conditions and keep their hands on the wheel.
Brown was driving near Williston, FL on May 7, 2016 when his Tesla struck the side of a truck trailer that was making a left turn in front of him. There is no evidence that the car attempted to slow down or make evasive maneuvers, according to the NTSB.
Brown, 40, who “loved technology,” believed that Tesla’s automation saved lives, according to a statement released by his family on Monday through their attorneys. “We heard numerous times that the car killed our son,” said the statement issued by the law firm Landskroner Grieco Merriman LLC. “That is simply not the case.”
The statement also praised Tesla for improving its Autopilot software after the accident, changes it said were a direct result of the crash.
Tesla didn’t provide an immediate comment on the draft conclusions. The company said in a statement last year that customers had to acknowledge Autopilot’s limitations before it would allow the system to operate. Every time the system is engaged, it reminds drivers: “Always keep your hands on the wheel. Be prepared to take over at any time.”
Following the accident, Tesla released an upgrade to its Autopilot software that makes it more difficult for drivers to ignore warnings to put their hands on the steering wheel – stopping the car if the warnings are ignored and only resuming function after the car has been parked.
Tesla also modified how the Autopilot system detects potential obstructions, improving its ability to distinguish the white side of a tractor trailer from a bright sky. The upgraded system emphasizes the use of radar over cameras, according to a company statement.
Bloomberg also reports that the safety board’s findings and recommendations could have broad implications for how self-driving technology is phased in on cars and trucks, and they come as Congress is debating legislation to spur autonomous vehicle systems. Tech and auto companies are pouring billions of dollars into a race to develop self-driving vehicles, which carmakers from Tesla to Volvo Cars say could be deployed in less than 10 years.
A class-action lawsuit was filed in a California court in April against the automaker, alleging that the Tesla autopilot system is “dangerously defective” when engaged – with cars sometimes veering out of lanes, braking for no reason, or failing to stop when approaching other vehicles.
In response, Tesla said that it never claimed the vehicles have “full self-driving capability,” adding that the suit misrepresented the facts and is nothing more than a “disingenuous attempt to secure attorney’s fees.”