FSD safety claims under scrutiny

With Tesla claiming its Full Self-Driving system is nine times safer than human drivers, the company’s safety statistics are facing serious scrutiny. Tesla reports one crash per 6.69 million miles when Autopilot or FSD is active. That’s dramatically better than the U.S. national average of one crash per 702,000 miles. But there’s a catch—experts question how Tesla’s numbers are collected and compared.
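
For readers who want to trace where the headline multiple comes from, here is a minimal sketch of the arithmetic using the two miles-per-crash figures quoted above; the variable names are illustrative, not Tesla’s terminology:

```python
# Rough arithmetic behind the "nine times safer" claim, using the
# miles-per-crash figures quoted in this article. Illustrative only.

TESLA_ASSISTED_MILES_PER_CRASH = 6_690_000  # Autopilot/FSD engaged (Tesla's reported figure)
US_NATIONAL_MILES_PER_CRASH = 702_000       # national average cited for comparison

# Ratio of miles driven per crash; a value above 1 means fewer crashes
# per mile than the baseline.
multiple = TESLA_ASSISTED_MILES_PER_CRASH / US_NATIONAL_MILES_PER_CRASH
print(f"Claimed safety multiple: {multiple:.1f}x")  # prints ~9.5x, the "nine times" headline
```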

Tesla only counts crashes reported to police. Minor accidents that don’t get police reports aren’t included. That’s a major problem because most everyday fender-benders don’t make it into official records. The national average, however, includes all reported crashes. This difference means Tesla’s comparison might not be apples-to-apples.
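
A quick way to see why the counting rule matters: if one side of the comparison records only a fraction of the crashes that actually occur, its miles-per-crash figure is inflated by the inverse of that fraction. The numbers below are hypothetical, chosen only to show the mechanics:

```python
# If only a fraction of crashes are counted (for example, only those with a
# police report), the measured miles-per-crash figure inflates by 1/fraction.
# Both numbers here are hypothetical, for illustration only.

def observed_miles_per_crash(true_miles_per_crash: float, reporting_fraction: float) -> float:
    """Miles per *recorded* crash when only reporting_fraction of crashes are counted."""
    return true_miles_per_crash / reporting_fraction

true_rate = 3_000_000  # hypothetical true miles per crash
for fraction in (1.0, 0.75, 0.5):
    print(f"reporting fraction {fraction:.2f} -> "
          f"{observed_miles_per_crash(true_rate, fraction):,.0f} miles per recorded crash")
```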

Another issue is how Tesla measures its non-Autopilot baseline. In 2023, Tesla reported one crash per 1.71 million miles for non-Autopilot driving. By Q2 2025, that figure had dropped to one crash per 963,000 miles, a huge shift for a baseline that should be relatively stable. Experts also note that Tesla’s non-Autopilot numbers still beat the national average, which suggests Tesla drivers, vehicles, or driving conditions differ from the typical U.S. mix. If so, measuring Autopilot and FSD against the national average overstates their advantage, and the swinging baseline says more about inconsistent data collection than about how safe the systems are.
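
To see how much the choice of baseline matters, compare the Autopilot/FSD figure against the three baselines mentioned in this piece. The multiple swings from roughly 4x to roughly 9.5x depending on which one is used (all figures are the ones quoted above):

```python
# How the claimed advantage shifts with the baseline, using the figures quoted above.

ASSISTED_MILES_PER_CRASH = 6_690_000  # Autopilot/FSD engaged

baselines = {
    "US national average": 702_000,
    "Tesla non-Autopilot (2023)": 1_710_000,
    "Tesla non-Autopilot (Q2 2025)": 963_000,
}

for name, miles_per_crash in baselines.items():
    print(f"vs {name}: {ASSISTED_MILES_PER_CRASH / miles_per_crash:.1f}x")
# vs US national average: 9.5x
# vs Tesla non-Autopilot (2023): 3.9x
# vs Tesla non-Autopilot (Q2 2025): 6.9x
```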

Tesla also lumps together basic Autopilot and Full Self-Driving without separating the data. The two systems work differently, so combining them makes it hard to know which system is actually safer.
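
A small sketch of why pooling the two systems obscures the picture: when one system logs far more (and typically easier) miles than the other, the combined miles-per-crash figure can sit close to the better system even if the other is much worse. Every number below is hypothetical and not drawn from Tesla’s data:

```python
# Hypothetical illustration of how a pooled crash rate can mask a weaker system.
# None of these numbers are Tesla's; they only demonstrate the mixing effect.

autopilot_miles, autopilot_crashes = 9_000_000, 1  # mostly easy highway miles (hypothetical)
fsd_miles, fsd_crashes = 1_000_000, 1              # harder city miles (hypothetical)

pooled_rate = (autopilot_miles + fsd_miles) / (autopilot_crashes + fsd_crashes)

print(f"Autopilot alone: {autopilot_miles / autopilot_crashes:,.0f} miles per crash")
print(f"FSD alone:       {fsd_miles / fsd_crashes:,.0f} miles per crash")
print(f"Pooled:          {pooled_rate:,.0f} miles per crash")  # looks far better than FSD alone
```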

Then there are the fatal accidents. The National Highway Traffic Safety Administration is investigating at least four crashes in which FSD was engaged, some of them involving pedestrian deaths. Investigators haven’t confirmed that FSD caused these crashes, but the investigations make clear the system is not infallible. According to available data, 734 Tesla-related deaths have been reported and documented, 59 of them specifically involving Autopilot or FSD. Roughly 90% of crashes involve human error factors such as distraction or poor lane discipline, the very problems Autopilot’s active monitoring is meant to address.

Tesla also keeps revising its safety reporting methodology, which means comparisons with past data aren’t always valid. The company’s claims look impressive on paper, but independent experts say Tesla needs better data collection methods and clearer comparisons with real-world driving. Until then, whether FSD is truly safer than human drivers remains an open question.