
Autopilot, Tesla’s advanced driver assistance system, is once again involved in a fatal accident. A Tesla with Autopilot engaged fatally struck a motorcyclist from behind in Draper, Utah, USA. The driver told investigators he did not see the motorcyclist, suggesting he was distracted and had placed all his trust in Autopilot. Fatal incidents involving Tesla vehicles are piling up, and US authorities are investigating them to determine Autopilot’s level of involvement and whether the software is as safe as Tesla claims.
Fatal accidents involving Tesla’s Autopilot have increased in recent years. The latest occurred in Draper, where a Tesla driver rammed a motorcyclist from behind on Sunday morning, killing him instantly. The Utah Highway Patrol said the car was on Autopilot at the time of the crash. It comes just days after the National Highway Traffic Safety Administration (NHTSA) opened a special investigation into a fatal crash earlier this month in which a 2021 Tesla Model Y killed a motorcyclist in California.
Autopilot is an advanced driver assistance system that is supposed to improve safety and comfort at the wheel. However, the name “Autopilot” has always misled Tesla drivers, causing them to get distracted while driving, and even to fall asleep at times. Even when Autopilot is supplemented with the “Full Self-Driving” (FSD) option, it is far from a perfect system: it fails at many simple maneuvers that a human driver can easily perform. For example, earlier this month, a YouTuber demonstrated how Tesla’s Autopilot nearly drove his Model 3 into an oncoming streetcar.
Heading toward the oncoming streetcar is just one of the many serious failures reported by the YouTuber. In a video showing the tests he performed, we can see the serious errors made by Autopilot. Here is a non-exhaustive list:
- the Tesla almost ran into a barrier indicating that the road was blocked (07:06);
- the Tesla chose the wrong path and was visibly confused (see the display on the dashboard, 11:06);
- the Tesla tried to run a red light while cross traffic was moving (12:12);
- the Tesla stopped in the middle of an intersection for no reason (13:09);
- the Tesla chose the wrong lane for a left turn (13:25);
- the Tesla repeatedly turned the left turn signal on and off for no reason, in a place where a left turn was not even allowed (15:02);
- the Tesla failed to make a left turn properly and almost hit bollards (17:13).
Last month, NHTSA released a report stating that Tesla cars running Autopilot were involved in 273 reported crashes over the past year. Of 367 accidents reported between July 2021 and May 15, 2022, Tesla vehicles accounted for roughly three-quarters of all accidents involving an advanced driver assistance system (ADAS). The report provides yet more data undermining claims that Autopilot is a safe system. Despite the name of its driver assistance system, Tesla has been forced by authorities to remind drivers to stay alert on the road and keep their hands on the wheel.
“Even if your vehicle is equipped with driver assistance or autopilot features, all vehicles require the driver to always be alert and watch the road,” said Michael Gordon of the Utah Highway Patrol on Sunday. The investigation NHTSA opened earlier this month is the 38th special investigation into a crash involving a Tesla vehicle since 2016. Of those crashes, 18 were fatal. According to NHTSA officials, the latest investigation, like most of the others, seeks to determine whether Autopilot, Tesla’s advanced driver assistance system, was in use at the time of the crash.
The agency declined to comment on the case, as the investigation is still ongoing. But local media reported that on July 7, a 48-year-old motorcyclist was killed in a collision on the Riverside Freeway in Riverside, California. He was riding in the HOV lane when he was struck from behind by the Tesla. NHTSA has also opened a special investigation into another fatal Tesla crash in Florida, which killed a 66-year-old Tesla driver and a 67-year-old passenger. In May, the agency began investigating a crash involving a 2022 Tesla Model S that killed three people.
Last month, NHTSA expanded its investigation into Tesla’s Autopilot system after a series of rear-end collisions across the United States in which Tesla vehicles crashed into parked emergency vehicles. Since the investigation began last August, NHTSA has identified six more accidents of this type, the most recent in January. The American automaker had rolled out a software update in September intended to enable Autopilot to recognize emergency vehicles even in difficult lighting conditions.
The special investigation now covers approximately 830,000 vehicles across the four current model series, from model years 2014 to 2022. It also examines how Autopilot exacerbates human error. According to Tesla, Autopilot is only an assistance system, so the driver must keep their hands on the wheel at all times and be ready to regain control of the vehicle at any moment. Tesla CEO Elon Musk says Autopilot makes driving safer and helps prevent accidents. However, damning reports on Autopilot tend to show otherwise.
Each year, NHTSA conducts an average of more than 100 special accident investigations into new technologies and other potential safety issues in the automotive industry. Last February, Tesla had to recall 53,822 vehicles equipped with Autopilot because a “rolling stop” feature could allow certain models to roll through some intersections without coming to a complete stop, posing a safety risk. NHTSA said the recall affects certain 2016-2022 Model S and Model X, 2017-2022 Model 3, and 2020-2022 Model Y vehicles. The agency asked Tesla to disable the feature.
A few days ago, the Munich I District Court ordered Tesla to reimburse a customer for most of the 112,000-euro purchase price of a Model X SUV. In its judgment, the court relied on a report, not yet made public, finding that the assistance system does not reliably recognize obstacles such as lane narrowing at a construction site. In addition, the car repeatedly brakes unnecessarily, which could pose a massive risk, particularly in city centers, and lead to rear-end collisions.
The court rejected the argument by Tesla’s lawyers that Autopilot was not intended for urban traffic. “Once again, this shows that Tesla is not delivering on its Autopilot promises,” plaintiff’s attorney Christoph Lindner said. In a July 2020 ruling, a Munich court banned Tesla Germany from repeating misleading statements about the capabilities of its Autopilot technology. Under that judgment, Tesla may no longer use the expressions “full potential of autonomous driving” or “Autopilot included” in its German advertising materials.
And you?
What is your opinion on the subject?
What do you think of Tesla’s Autopilot?
In your opinion, is the software reliable? Does it really improve safety on the road?
What do you think explains the high confidence Tesla drivers have in Autopilot?
See also
Tesla’s Autopilot practically slams a Model 3 into an oncoming streetcar, Full Self-Driving option was on
Tesla vehicles running Autopilot were involved in 273 reported crashes since last year, according to National Highway Traffic Safety Administration data
Tesla is recalling nearly 54,000 vehicles likely to violate stop signs, 2016-2022 Model S and Model X, 2017-2022 Model 3 and 2020-2022 Model Y are affected
Munich court orders Tesla to reimburse customer for problems with Autopilot, after finding safety flaws in automaker’s technology