Hacked Tesla FSD computers disclose alarming raw data on deadly Autopilot accidents

王林
Release: 2024-07-31 06:53:12


While the latest Tesla FSD 12.5 update is rolling out to the public, two major business publications have posted negative stories about Tesla's driver-assist systems.

The Bloomberg story cites anecdotal evidence from a sell-side analyst who detailed his experience during Tesla FSD demo drives in a note to clients. He had to intervene a few times in peculiar situations, such as when the car had to obey hand signals from a traffic officer, but also in more mundane scenarios.

The Tesla Model Y repeatedly crossed solid lane markings, for instance, or sped into an intersection because the computer misread a situation where another car was only halfway through a right turn. The Model Y was driving on FSD 12.3.6, though, and the analyst did note some marked improvements over the previous version he had tested earlier this year.

The other negative article about Tesla's self-driving ambitions, however, levels much more serious accusations and is backed by exclusive evidence. The Wall Street Journal's investigators sourced FSD computers from salvaged Teslas and sent them to hackers to extract the raw Autopilot data.

Tesla otherwise keeps this kind of data a trade secret, since it shows how its driver-assist algorithms think and react on the fly, relying solely on interpreting the input of a few Tesla Vision cameras.

The insights from the raw Autopilot decision-making data were then paired with accident footage from the Tesla cars' cameras. The WSJ also matched individual state reports against the federal crash database that the NHTSA maintains for its own investigations, and managed to reconstruct 222 Tesla crashes.

In 44 of those, the accidents occurred when Tesla cars on Autopilot "veered suddenly," while 31 happened when the vehicles "failed to stop or yield." The investigation found that the latter led to the most severe accidents with Teslas driving on Autopilot.

Experts who reviewed the footage of such crashes and the way the Autopilot system works algorithmically said that it will take time to train it on everything that happens on the roads. One of the deadly accidents, for instance, was due to a failure to recognize an overturned double trailer blocking the highway.

The system didn't know what this was, so it smashed into the trailer at full speed. There are plenty of examples where Autopilot gets bamboozled by the lights of emergency vehicles and slams into them, too.

Overall, the findings indicate that self-driving Teslas crash for both hardware and software reasons, with issues spanning from slow algorithm updates to insufficient camera calibration. However, more in-depth coverage from independent sources may be needed before this Autopilot data leak manages to challenge Elon Musk's main assumption: that Tesla's self-driving feature is ultimately safer than human drivers.



Source: notebookcheck.net