A crash that killed the driver of a Tesla Model S electric car operating in Autopilot mode has called into question the safety of driverless vehicle technology. This week, federal officials announced the launch of a formal investigation into the accident.
The crash occurred on May 7 in Williston, Florida, when a tractor-trailer made a left turn in front of the Tesla and the car failed to apply the brakes, the New York Times reported. It is the first known fatal accident involving a car operating under automated driving control.
In a statement posted on the company’s blog Thursday (June 30), Tesla noted that the fatality was the first “in just over 130 million miles where Autopilot was activated.” “It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled,” Tesla officials wrote.
The Model S is not a self-driving car; rather, Tesla’s Autopilot feature is an assistive technology and a first step toward bringing truly driverless cars to market. Using computer software, sensors, cameras and radar, Autopilot can complete tasks such as merging onto a highway, the Atlantic reported. Drivers are instructed to keep their hands on the wheel while in Autopilot mode.
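Systems of this kind typically fuse several sensor streams before taking an action such as automatic braking. The sketch below is a hypothetical illustration of that sort of fusion logic, not Tesla’s actual implementation; the Detection class, the confidence threshold and the rule requiring camera and radar to agree are all assumptions made for this example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    """A single obstacle report from one sensor."""
    distance_m: float   # distance to the obstacle, in meters
    confidence: float   # the sensor's confidence in the detection, 0.0 to 1.0

def should_brake(camera: Optional[Detection],
                 radar: Optional[Detection],
                 braking_distance_m: float = 40.0,
                 min_confidence: float = 0.7) -> bool:
    """Trigger automatic braking only when both sensors agree.

    Requiring agreement suppresses false alarms (e.g., radar echoes from
    overhead road signs), but it also means one blinded sensor, such as a
    camera washed out by a bright sky, can prevent braking entirely.
    """
    for sensor in (camera, radar):
        if sensor is None or sensor.confidence < min_confidence:
            return False  # one sensor missed or is unsure: take no action
    return min(camera.distance_m, radar.distance_m) <= braking_distance_m

# A glare-blinded camera reports nothing, so no brake command is issued
# even though the radar sees an obstacle 30 meters ahead.
print(should_brake(camera=None, radar=Detection(30.0, 0.95)))  # False
```

Under a rule like this, a trailer that the camera cannot distinguish from the sky would never trigger the brakes, which is consistent with the failure mode Tesla described.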
Tesla did not specify in its statement how engaged the driver was at the time of the crash, but did note: “Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied.”
Other companies, such as General Motors and Google, have also invested in the development of driverless car technology. In February, one of Google’s self-driving cars crashed into a bus, though there were no reported injuries.
As tests on autonomous vehicles continue, the question is whether the technology has progressed to the point that the government would approve cars that can drive themselves.
In fact, a study published in October 2015 by the University of Michigan’s Transportation Research Institute found that, per million miles traveled, self-driving cars had a higher crash rate than conventional cars. At the time of the study, however, no self-driving car had been found at fault in any of the crashes it was involved in.
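The “per million miles” figure is a simple exposure-normalized rate. The arithmetic below shows how such a rate is computed; the numbers are placeholders chosen for illustration, not data from the Michigan study.

```python
def crashes_per_million_miles(crashes: int, miles_traveled: float) -> float:
    """Normalize a raw crash count by exposure (total miles driven)."""
    return crashes / (miles_traveled / 1_000_000)

# Hypothetical figures for illustration only, not the study's data.
av_rate = crashes_per_million_miles(crashes=11, miles_traveled=1_200_000)
human_rate = crashes_per_million_miles(crashes=190, miles_traveled=100_000_000)
print(f"AVs: {av_rate:.1f} vs. conventional: {human_rate:.1f} "
      "crashes per million miles")
```

Normalizing by miles driven matters because self-driving fleets have logged far fewer miles than human drivers, so raw crash counts alone would be misleading.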
There’s also a moral dilemma at play, as a driverless vehicle may have to decide which lives to save in the event of a serious accident. A recent study published in the journal Science found that people approve of autonomous vehicles (AVs) governed by utilitarian ethics, that is, vehicles that minimize the total number of deaths in a crash even if the people inside the vehicle are harmed. However, most respondents said they would not want to ride in those vehicles themselves, Live Science reported.
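Stated as an algorithm, a strictly utilitarian policy simply picks whichever maneuver minimizes total expected deaths, with no regard for who dies. The toy sketch below formalizes the dilemma described in the study; the scenario and counts are invented, and no production vehicle is known to run logic like this.

```python
def utilitarian_choice(options: dict[str, dict[str, int]]) -> str:
    """Pick the maneuver with the fewest total expected deaths,
    weighting occupants and pedestrians equally."""
    return min(options, key=lambda name: sum(options[name].values()))

# Toy scenario: staying the course kills three pedestrians, while
# swerving kills the car's lone occupant.
scenario = {
    "stay_course": {"occupant_deaths": 0, "pedestrian_deaths": 3},
    "swerve":      {"occupant_deaths": 1, "pedestrian_deaths": 0},
}
print(utilitarian_choice(scenario))  # swerve: the occupant is sacrificed
```

That last line is exactly the outcome survey respondents endorsed in the abstract but rejected for a car they would ride in themselves.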
“The moral dilemma for AV is something that is brand-new,” said study co-author Jean-François Bonnefon, a research director at the Toulouse School of Economics in France. “We’re talking about owning an object, which you interact with every day, knowing that this object might decide to kill you in certain situations.”