Tesla Model S Autopilot Caused a Fatal Accident


On May 7th, a man named Joshua Brown lost his life on the road when the cameras on his Tesla Model S failed to detect a tractor-trailer and the car struck it. The car was in Autopilot mode at the moment the accident occurred.


This would have remained a simple car accident story if the NHTSA (National Highway Traffic Safety Administration) hadn't gotten involved. What they are trying to do is determine whether or not the self-driving system is safe.

Tesla Model S Self-Driving Car Driver Died

The driver of the car was a well-known Tesla enthusiast and a friend of the company, as it stated in its official notice. He used to upload videos of his Tesla, including some in which he showed how Tesla's autonomous driving had saved his life.

There are few actual details about the accident beyond the information provided by Tesla Motors. The company said that the incident happened on a divided highway with Autopilot engaged. The tractor-trailer drove across the highway at 90 degrees to the Model S. In that moment, neither the sensors nor the driver reacted appropriately, and the result was fatal. The car ran under the trailer, which sheared off its windshield and roof completely.

Crashed Tesla Model S

The car was found a hundred feet from the point of impact, which tells us it was moving very fast. Additionally, a portable DVD player was found in the car, which might have been a distraction for the driver.

This is not the first time a Tesla Model S has had trouble seeing high trailers. Earlier this year, a driver reported that his Tesla Model S, when summoned with the autonomous parking feature, didn't stop in front of a tractor-trailer. We have no doubt that Tesla Motors engineers will find some kind of solution, with longer-range sensors and better cameras, but we need to keep in mind that the system is still in beta testing. Here is the blog post about the accident on the official Tesla website.

Autonomous EV Cars

One should distinguish between "autonomous cars" and the "Autopilot option" that Tesla cars have. Google is building fully autonomous cars, and this investigation may be a problem for it and for other companies with autonomous-car projects.

Autonomous cars are cars that don't need an actual driver. They don't need brake or gas pedals, a steering wheel, or a shifter. A Tesla is not an autonomous car. It has an Autopilot option, which should be seen as an advanced version of cruise control. You, as the driver, must watch the road and take over when needed. It doesn't matter whether the system is in beta or a full release: you need to watch the road, for your own safety and the safety of others. Planes have had autopilots since 1912, and the way people use them hasn't changed since then. An autopilot serves only as an assistant in driving.

Semi-autonomous cars, or cars with an autopilot option, are a new thing. Humankind was introduced to them only last year. News of the Autopilot came as a shock to the general population and to Tesla drivers alike. Since its first activation, the Autopilot option has had some minor flaws; it is not perfect. This is the first accident with a fatal outcome, but its development mustn't be stopped because of it. And this fatality came after more than 130 million miles driven on Autopilot without one.

Every year, 1.3 million people lose their lives in car accidents caused by human error, not by an autopilot.

Fear of new things has been part of human behavior since the dawn of man. What would have happened if man had stopped making fire when he first felt the burn?

The NHTSA should conduct its investigation, and perhaps laws should be passed requiring people to pay additional attention while using the "Autopilot" option. Even Tesla Motors advises drivers to keep their hands near the steering wheel while the autonomous driving option is in use.