Tesla’s advanced driver-assistance system Autopilot is coming under increased scrutiny just months before CEO Elon Musk planned to put fully self-driving cars on the streets, after three crashes that killed three people, AFP reported on January 3rd.
On Sunday December 29th, a Tesla Model S sedan left a motorway in Gardena, California, at high speed, ran a red light and struck a Honda Civic, killing the Civic's two occupants, according to police. A man and a woman in the Tesla were hospitalised with non-life-threatening injuries, according to The Guardian, whose report included footage appearing to show the Tesla driver asleep at the wheel. Authorities have not yet said whether Autopilot was involved in the crash.
The same day, a Tesla Model 3 hit a parked fire engine on an Indiana motorway, killing a passenger in the Tesla. On December 7th, a Model 3 struck a police car on a Connecticut main road, though no one was hurt.
The National Highway Traffic Safety Administration (NHTSA) is investigating the Connecticut crash, in which the driver told police the car was operating on Autopilot. The system is designed to keep a car in its lane and at a safe distance from other vehicles, and can change lanes on its own.
Tesla has repeatedly said that Autopilot is designed only to assist drivers, who must still pay attention and be ready to intervene at all times. The company states that Teslas with Autopilot are safer than vehicles without it, but advises that the system does not prevent all crashes.
Doubts about Tesla's Autopilot system have long persisted. David Friedman, vice president of advocacy for Consumer Reports and a former acting NHTSA administrator, said the agency should have declared Autopilot defective and recalled the product after a 2016 crash in Florida that killed a driver. Neither Tesla's system nor the driver had braked before the car went underneath a semi-trailer that had turned in front of the car.
"We don't need any more people getting hurt for us to know that there is a problem and that Tesla and NHTSA have failed to address it," Friedman said. "The public is owed some explanation for the lack of action. Simply saying they're continuing to investigate - that line has worn out its usefulness and its credibility."
In a statement, NHTSA said it relied on data to make decisions, and if it finds any vehicle poses an unreasonable safety risk, "the agency will not hesitate to take action." NHTSA also has said it does not want to stand in the way of technology given its life-saving potential.
In April, Musk said he expected to start converting the company's electric cars to fully self-driving vehicles in 2020 in order to create a network of robotic taxis to compete against Uber and other ride-hailing services.
At the time, experts said the technology was not ready and that Tesla's camera and radar sensors were not good enough for a self-driving system. Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University, and others say additional crashes have proved that to be true.
Many experts say they are not aware of fatal crashes involving similar driver-assist systems from General Motors (GM), Mercedes and other automakers. GM monitors drivers with cameras and will shut down the driving system if they do not watch the road.
"Tesla is nowhere close to that standard," Rajkumar said, predicting more deaths involving Teslas if NHTSA fails to take action. "This is very unfortunate… Just tragic," he said.