Autopilot, the driver-assistance system installed in Tesla cars, helps control speed and lane keeping according to traffic conditions, but traffic accidents involving Tesla vehicles with Autopilot engaged have recently become a problem.
Against this backdrop, an MIT research team used more than 800,000 km of driving data to investigate where drivers direct their gaze when Autopilot is engaged in a Tesla vehicle.
Autopilot is, after all, a function that only assists with some driving operations, and the driver is required to keep their hands on the steering wheel. However, there have been reports of drivers overtrusting Autopilot, using a smartphone while driving, or tricking the sensors so they can leave the driver's seat, and Autopilot itself can misread lanes. In August 2021, a series of collisions was reported in which Tesla cars using Autopilot struck stopped emergency vehicles.
Tesla, meanwhile, repeatedly claims in its regular safety reports that Autopilot reduces traffic accidents. In a tweet citing a report released in April 2021, CEO Elon Musk said that a Tesla driving with Autopilot engaged was involved in accidents at roughly one-tenth the average rate.
As part of its ongoing research on vehicle driving and technology, the research team analyzed drivers' posture and face position while they drove a Tesla Model S or Model X, producing data that shows where each driver's gaze was directed. Using more than 800,000 km of driving data collected since 2016, the team investigated how gaze behavior changes when Autopilot is turned off and on.
The survey found that the percentage of glances directed at locations unrelated to the driving task for more than 2 seconds was 4% with Autopilot off, but rose to 22% with Autopilot on. With Autopilot off, drivers were more likely to glance at the side or rearview mirrors; with Autopilot on, the time spent looking at the center dashboard screen and similar locations increased.
The team points out that these behavioral changes can stem from a misunderstanding of the system and its limitations, and that this misunderstanding is reinforced by how well the driving automation performs. Autopilot only plays a supporting role in driving, but drivers overtrust it, and once it is engaged they evidently pay less attention to the road.
Regarding the change in driver gaze depending on whether Autopilot is on or off, the research team suggests that the system should monitor not only the road but also the driver's posture and gaze, and issue warnings when necessary. The report notes that GM's Super Cruise and Nissan's ProPilot 2.0 driver-assistance systems already use infrared cameras to track the driver's gaze and warn of distraction, and argues that a similar mechanism will be needed for Tesla's automated driving.
Meanwhile, Tesla CEO Elon Musk tweeted on September 25 that the company was releasing FSD Beta 10, the latest firmware version of its Full Self-Driving software, to more customers. Drivers requesting access to the beta will need to pass a Tesla safety review, he said.
The FSD beta was announced in October 2020 and has so far been available only to a very small number of expert and attentive drivers. Tesla also announced in July 2021 that FSD would be offered as a paid subscription for $199 per month; previously, the FSD package had been sold as a one-time purchase for $10,000.
However, the US consumer group Consumers Union points out that the feature is full self-driving in name only, is not worth its high price, and still has too many problems to be called fully autonomous. Indeed, FSD beta 9, delivered by Tesla in July 2021, raised safety questions of its own, including reports of vehicles suddenly lurching.
As a result, the National Transportation Safety Board has said that basic safety issues must be addressed before Tesla moves forward with fully automated driving. Some critics have also called Tesla's use of the term "full self-driving" misleading and irresponsible.
Meanwhile, CEO Elon Musk said that the page for requesting the FSD beta would open on September 25, 2021, and that FSD beta 10.1 would follow shortly, since roughly 24 hours of testing was still required. In response to safety concerns, Tesla's policy is to evaluate seven days of a user's driving data against five checkpoints before offering the FSD beta, calculate a safety score that estimates how likely that user's driving is to lead to a future crash, and grant beta access only to users who pass.
According to Tesla, the five checkpoints yield a safety score out of 100, and most drivers score 80 or higher. The checkpoints are: the number of forward collision warnings per 1,600 km; the rate of hard braking, measured as decelerations of 0.1 g or more; aggressive turning, measured as lateral accelerations of 0.4 g or more; unsafe following, based on the time the vehicle would need to come to a sudden stop behind the car ahead; and whether Autopilot was forcibly disengaged because the driver was judged to be inattentive.
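To make the scoring idea concrete, here is a minimal sketch of a penalty-based score built from the five checkpoints above. The article does not disclose Tesla's actual formula, so the weights, the linear-deduction model, and the function name below are all invented for illustration only.

```python
# Hypothetical illustration of a penalty-based safety score.
# The five inputs mirror the five checkpoints the article lists;
# every weight and the deduction model itself are assumptions,
# not Tesla's actual (undisclosed here) formula.

def safety_score(fcw_per_1600km, hard_brake_pct, aggressive_turn_pct,
                 unsafe_follow_pct, forced_disengagement):
    """Return an illustrative score out of 100."""
    score = 100.0
    score -= fcw_per_1600km * 2.0        # forward collision warnings per 1,600 km
    score -= hard_brake_pct * 1.5        # % of braking events >= 0.1 g deceleration
    score -= aggressive_turn_pct * 1.0   # % of turns with >= 0.4 g lateral accel.
    score -= unsafe_follow_pct * 0.5     # % of time following too closely
    if forced_disengagement:             # Autopilot forced off for inattention
        score -= 20.0
    return max(0.0, min(100.0, score))   # clamp to the 0-100 range

# A driver with few incident events clears the 80-point threshold the
# article mentions; one with a forced disengagement falls well short.
print(safety_score(1, 2.0, 1.0, 4.0, False))  # 92.0
print(safety_score(3, 5.0, 4.0, 10.0, True))  # 57.5
```

The key design point is that the score only ever decreases from 100, so a driver with no flagged events keeps a perfect score, matching the intuition that the checkpoints count risky events rather than reward good ones.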