At the end of June, after the first death in an accident involving Tesla's Autopilot system, the company said in a press release: “This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles.” That sounds reassuring, but data without context is generally misleading.
The company goes on to position the death as a rarity that confirms the overall safety of Tesla's Autopilot:
Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.
Tesla doesn't cite a source for its U.S. fatality-per-mile figure, nor for the global average of one fatality per 60 million miles. Safety data generated by Autopilot testing should be opened for customers to review before they buy and use the system.
Tesla's own comparison implies the scale. At one fatality per 94 million miles, the 32,675 U.S. road deaths recorded in 2014 correspond to roughly 3.07 trillion miles driven, in line with federal travel estimates for that year. If all of those miles had been driven on Tesla's Autopilot, at one fatality per 130 million miles, about 23,600 people would die annually in the U.S. On the face of it, certainly better than human drivers.
The year 2014 saw a historic low in U.S. driving fatalities: 32,675 deaths, or 10.8 per 100,000 people, after progressively better results over many years, according to the Centers for Disease Control and Prevention's National Center for Health Statistics. Death rates then headed up almost 10 percent in 2015.
Measured in fatalities per mile, Autopilot does seem to perform more safely than the average human driver. But we don't know whether the system has blind spots its engineers are still working to plug, or whether it makes more or fewer small, non-fatal mistakes than a human driver. And the margin is thin: had three people died in the fatal Florida Tesla accident in June, Autopilot's record would have been one death per roughly 43 million miles, a worse rate than the 94-million-mile human average.
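Both back-of-the-envelope figures above follow directly from the per-mile rates in Tesla's statement. A minimal sanity check, assuming only the 94-million and 130-million-mile rates and the 2014 death toll quoted in this article:

```python
# Rates quoted in Tesla's press release and the 2014 U.S. death toll.
HUMAN_MILES_PER_DEATH = 94e6       # one U.S. fatality per 94 million miles
AUTOPILOT_MILES_PER_DEATH = 130e6  # one Autopilot fatality per 130 million miles
US_ROAD_DEATHS_2014 = 32_675

# Total annual mileage implied by the human rate and the 2014 death toll.
implied_total_miles = US_ROAD_DEATHS_2014 * HUMAN_MILES_PER_DEATH

# Projected annual deaths if every mile were driven at the Autopilot rate.
autopilot_deaths = implied_total_miles / AUTOPILOT_MILES_PER_DEATH

# Counterfactual: three deaths in the same 130 million Autopilot miles.
counterfactual_miles_per_death = AUTOPILOT_MILES_PER_DEATH / 3

print(f"implied total miles: {implied_total_miles / 1e12:.2f} trillion")
print(f"projected Autopilot deaths: {autopilot_deaths:,.0f}")
print(f"3-death rate: one per {counterfactual_miles_per_death / 1e6:.1f}M miles")
```

At one death per roughly 43 million miles, the three-death counterfactual would put Autopilot well below, not at, the 94-million-mile human benchmark.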
It's also important to understand that the 130 million miles of Autopilot driving Tesla points to is active beta testing with its customers' lives in the balance. Car owners aren't test pilots in the Chuck Yeager vein, and the fact that the company is sending them out with an active Autopilot system still in its testing stages should concern us.
If the company has broken securities law, as may be the case (an SEC investigation is said to be underway), the conservative assumption is that its safety filings may not reflect the full risk Autopilot involves. Tesla should act to increase transparency now.
This is a situation in which all Autopilot data should be opened to public inspection, so that customers don't need to rely on Tesla's assurances. A corps of independent data scientists would give consumers a better risk evaluation than a company seeking to lower its own liability profile. By opening its beta-stage Autopilot data to public scrutiny, Tesla could make an important contribution to the co-development of products with informed customers.