This article is reprinted with the authorization of AI New Media Qubit (public account ID: QbitAI). Please contact the source for reprinting.
An investigation into Tesla caused an uproar on social media.
Some netizens even used "epic" to describe the findings of this investigation:
The investigation comes from the US National Highway Traffic Safety Administration (NHTSA), and among the released documents, one sentence stands out:
Autopilot aborted vehicle control less than one second prior to the first impact.
In other words: less than one second before the first impact, Autopilot stopped controlling the vehicle.
What NHTSA discovered landed like a bombshell.
To many, it sounded like "the robot handing the steering wheel back to you right before the car crashes"...
Many netizens then raised the question:
With this design, can Tesla deny that an accident was Autopilot's fault?
Ah, this...
NHTSA’s investigation into Tesla began in August last year.
The direct trigger was a string of crashes in which Teslas, driving with the Autopilot system engaged, plowed into emergency vehicles, police cars, or crashed cars that were plainly already stopped on or beside the road.
And what the investigation turned up was startling.
NHTSA found that in 16 such crashes, most of the vehicles had triggered a forward collision warning (FCW) before impact, and in about half of them the automatic emergency braking (AEB) also actively intervened.
Yet none of this prevented the collisions (one of which was fatal).
More alarming still: across these 16 crashes, Autopilot on average aborted control of the car less than one second before the actual impact.
That gave human drivers nowhere near enough time to take over. Crash footage shows that drivers could generally see the accident scene ahead about 8 seconds before the collision. Yet forensic data from 11 of the crashes showed no driver taking evasive action in the 2-5 seconds before impact, even though all of them kept their hands on the steering wheel as Autopilot requires.
Perhaps most drivers still "trusted" Autopilot in that moment: in 9 of the vehicles, the drivers did not react to the visual or audible warnings the system issued in the final minute before the collision.
In four of the vehicles, however, the system gave no warning at all.
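To make those numbers concrete, here is a minimal sketch (in Python, with a hypothetical data schema; only the under-1-second abort figure comes from the report, while the other example values and the 2-second reaction threshold are assumptions) of how one might check whether a disengagement left the driver a realistic takeover window:

```python
from dataclasses import dataclass
from typing import Optional

# Assumed minimum time a driver needs to perceive a hazard and act.
# This 2-second figure is an illustrative assumption, not an NHTSA number.
MIN_TAKEOVER_WINDOW_S = 2.0

@dataclass
class CrashTimeline:
    """Event times, in seconds before impact (hypothetical schema)."""
    fcw_s: Optional[float]       # forward collision warning fired
    aeb_s: Optional[float]       # automatic emergency braking engaged
    ap_abort_s: Optional[float]  # Autopilot relinquished control

def takeover_window_adequate(t: CrashTimeline) -> bool:
    """True if Autopilot handed back control early enough for a human to react."""
    if t.ap_abort_s is None:
        return False  # Autopilot never disengaged before impact
    return t.ap_abort_s >= MIN_TAKEOVER_WINDOW_S

# Example values are hypothetical except the abort time: the report's average
# case has Autopilot aborting control less than 1 second before impact.
avg_case = CrashTimeline(fcw_s=8.0, aeb_s=3.0, ap_abort_s=0.9)
print(takeover_window_adequate(avg_case))  # False: no realistic takeover window
```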
Now, to better understand the safety of Autopilot and related systems (specifically, the extent to which they can undermine human drivers' supervision and increase risk), NHTSA has decided to upgrade this preliminary investigation to an engineering analysis (EA).
The scope will expand to all four current Tesla models: Model S, Model X, Model 3, and Model Y, some 830,000 vehicles in total.
Once the investigation's findings came out, affected owners began coming forward with their own accounts.
One user said that as early as 2016, her Model S hit a car parked on the roadside while changing lanes. Tesla told her the fault was hers, because she hit the brakes before the crash, which disengaged Autopilot.
The woman countered that the system had also issued a warning at that moment, so whatever she did she would have been wrong: had she not braked, the blame would have been that she ignored the warning.
Musk did not respond to the matter directly. But six hours after the news broke, he happened to tweet an NHTSA investigation report on Tesla dating back to 2018.
That report stated that in all of NHTSA's prior testing, Tesla's Model S (2014 production) and Model X (2015 production) had the lowest probability of occupant injury in a crash of any cars tested.
It then found that the newer Model 3 (2018 production) had displaced the Model S and Model X to take the top spot.
△ Tesla also took the opportunity to do a round of publicity
This was read as a response to the Tesla crash in Shanghai a few days earlier, in which the injuries involved were still unclear.
Interestingly, just as everyone was denouncing Tesla's Autopilot as unreliable, some people stepped forward to argue that the crowd was unfairly smearing Tesla.
In fact, Autopilot cutting out one second before a crash does not pin the blame on the human. In Tesla's own accident statistics, as long as Autopilot was active at any point within the 5 seconds before a collision, responsibility for the crash is assigned to Autopilot.
Tesla officials later came out and confirmed this rule.
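As a rough illustration of that counting rule (a sketch only: the 5-second threshold is as described above, while the function and parameter names are hypothetical):

```python
from typing import Optional

ATTRIBUTION_WINDOW_S = 5.0  # the 5-second rule Tesla confirmed

def attributed_to_autopilot(last_active_s_before_impact: Optional[float]) -> bool:
    """Return True if a crash counts against Autopilot under this rule.

    `last_active_s_before_impact` is how many seconds before the collision
    Autopilot was last engaged; None means it was never engaged.
    """
    if last_active_s_before_impact is None:
        return False
    return last_active_s_before_impact <= ATTRIBUTION_WINDOW_S

# Disengaging 1 second before impact still falls inside the 5-second window,
# so the crash is still counted against Autopilot.
print(attributed_to_autopilot(1.0))   # True
print(attributed_to_autopilot(12.0))  # False: disengaged well before the crash
```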
Still, even though the human no longer takes the blame, the questionable move of cutting off the automated driving one second before the crash does nothing to change the fact that the human driver had no time to take the wheel.
Having gone through NHTSA's findings, let's take a closer look at Autopilot itself, the focus of this round of public attention.
Autopilot is Tesla's advanced driver assistance system (ADAS). On the autonomous-driving scale proposed by SAE International (the Society of Automotive Engineers), it sits at Level 2.
(SAE divides autonomous driving into six levels, from L0 to L5)
According to Tesla's official description, Autopilot's current features include assisted steering, acceleration and braking within a lane, automatic parking, and the ability to "summon" the car from a garage or parking space.
So does that mean the driver can go fully hands-off?
Not at all.
Tesla's current Autopilot plays only an "assisting" role; it is not full self-driving.
Moreover, the driver is required to actively and attentively supervise Autopilot.
That said, in its official introduction Tesla also offers a brief description of its "full self-driving" capability:
All new Teslas have the hardware needed for full self-driving in almost all situations in the future.
They will be able to make both short and long trips without the need for a driver.
However, for Autopilot to achieve these goals, it is essential that it be far safer than a human driver.
On that front, Tesla says it has proven as much over billions of miles of testing.
And Musk has spoken out about Tesla's safety more than once:
The safety level of Tesla's full self-driving is far higher than that of an ordinary driver.
But is this true?
Whatever Tesla and Musk may claim, in practice Autopilot's safety record has been controversial.
The recurring "phantom braking" incidents, for example, have repeatedly pushed it to the center of public attention, and they are also one of the main reasons NHTSA launched this investigation.
"Phantom braking" refers to a Tesla performing unnecessary emergency braking while the Autopilot assisted-driving function is on, even when there is no obstacle ahead and no risk of colliding with the vehicle in front.
This poses serious safety risks to the driver and to other vehicles on the road.
Beyond that, a closer look at Tesla's headline-making incidents shows that many of them involve Autopilot safety:
So what does NHTSA think of this?
In the document, NHTSA reminds the public that there are currently no fully autonomous cars on the market: "Every vehicle requires the driver to be in control at all times, and all state laws hold the driver responsible for the operation of their vehicle."
As for how this investigation could affect Tesla going forward, NHTSA says:
If a safety-related defect is found, it has the authority to issue a "recall request" letter to the manufacturer.
......
Finally, a quick poll: do you trust Autopilot?
Full report:
https://static.nhtsa.gov/odi/inv/2022/INOA-EA22002-3184.PDF