In early May, a fatal car accident on a Florida highway seemed like just one of the many tragedies that occur on our roads daily. The driver, a 40-year-old man from Canton, Ohio, was operating a Tesla in Williston, Florida, when he crashed into a tractor-trailer while traveling 65 mph.
But then in late June, federal authorities joined the investigation when it was learned the Tesla’s Autopilot system was engaged at the time of the crash. In effect, this was a self-driving car, a class of vehicle touted as far safer than human-operated ones. After all, human error is the cause of 90 percent of all crashes. Yet even though the Autopilot system was engaged, neither the driver nor the system activated the brakes before the car struck the tractor-trailer.
Now the federal government wants answers, but Tesla insists the cars are safe. There are 70,000 Tesla vehicles on the road equipped with the Autopilot feature. Company representatives maintain that the systems are safe when used as intended and that the problem in these cases is more likely than not driver error; that is, drivers are not using the systems properly.
But is the term “self-driving” vehicle a bit of a misnomer? Are drivers being given a false sense of security about these systems?
An unnamed Tesla executive said in an interview with The New York Times that while the system is safe, consumers who misuse the Autopilot feature are toying with life and death. He added that drivers have to stay cognizant of road conditions and be willing and able to take control of the vehicle at a moment’s notice. The vehicle can control speed, steering and other functions for up to three minutes without any driver involvement. Of course, even drivers without an autopilot feature sometimes fail to watch the road and engage in safe driving habits. The concern is that these vehicles are lulling drivers into believing the systems can do it all, when in fact they cannot. The executive said the issue is often a lack of customer education.
But who is supposed to be responsible for educating them? And who is liable when this happens again, as inevitably it will?
It will be interesting to see how the company responds to the series of questions submitted to it in a nine-page letter from the National Highway Traffic Safety Administration (NHTSA). The agency is examining two alleged defects in the vehicle: the failure of the automatic emergency braking system and the failure of the forward collision warning. The NHTSA wants to know the number of incidents in which the manufacturer’s automatic braking system was activated, and it wants the company to turn over any information on reported crashes or other incidents in which these same defects may have been a factor.
Traffic safety advocates have asserted that self-driving cars have great potential to save lives. However, the current generation of driver-assist cars poses major risks to drivers, largely through mixed messages: drivers are sold on the idea that they can disengage from operating the vehicle, when in fact they cannot. This opens the door to manufacturer liability.
If you have been injured in an accident, contact the Hollander Law Firm at 888-751-7777 for a free and confidential consultation. There is no fee unless we win.
As U.S. Investigates Fatal Tesla Crash, Company Defends Autopilot System, July 12, 2016, By Bill Vlasic and Neal Boudette, The New York Times
More Blog Entries:
NHTSA: Traffic Deaths Up Almost 8 Percent, July 8, 2016, Boca Raton Injury Attorney Blog