Tesla is adding a driving safety test to Full Self-Driving after a number of regulators, lawmakers, and vehicle owners expressed concerns about the company's Autopilot and Full Self-Driving software. To gain access to the feature, drivers must first complete a seven-day driving evaluation.

This requirement will be introduced with FSD Beta Version 10.1, which the company says improves on prior releases such as Version 9.2, a build that even Elon Musk conceded was riddled with faults.

The National Highway Traffic Safety Administration (NHTSA) in the United States has raised particular concern about the safety of Tesla's driver-assist features following several accidents in which Teslas operating on Autopilot collided with emergency vehicles.

Members of Congress have also asked the FTC to investigate Tesla's claims about Autopilot, which even some of Tesla's own software engineers have characterized as overstated in communications with California's DMV.

Elon Musk has acknowledged that he ordered Tesla to prioritize Autopilot development following a fatal accident in which a Tesla struck a cyclist. According to court records filed in a lawsuit against Tesla, the driver admitted to falling asleep at the wheel.

The case was quickly dismissed, but it highlighted statistics compiled by the NHTSA itself showing that driver error is a factor in an estimated 94% of vehicle crashes.

Tesla's Autopilot has since demonstrated that it can safely pull a vehicle over to the side of the road if the driver loses consciousness in real-world driving. The system's internal sensors can be fooled by strategically placed weights on the driver's seat and steering wheel, but the company recently activated the cabin camera mounted above the rear-view mirror to monitor driver attention.

Musk has admitted that fully capable self-driving software is harder to build than he anticipated. As a result, Tesla has repeatedly warned drivers not to grow complacent when using its driver-assistance features.

Even so, Musk appears determined to push Autopilot, Full Self-Driving, or both to Level 5 on the SAE's scale of driving automation, which would require software capable of handling all driving conditions without any human input.

The added evaluation requirement offers a new layer of safety by discouraging drivers from growing complacent with Full Self-Driving (and may also reduce Tesla's exposure to liability). The evaluation tracks a number of behaviors the vehicle's sensors can detect, such as the frequency of hard braking and aggressive turning.

According to updated information on Tesla's website, the data will be compiled into a "safety score" that estimates the likelihood that the driver's behavior will contribute to a future collision.
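
Tesla hasn't published the underlying formula, but the rollup might conceptually resemble the minimal sketch below, which normalizes per-trip event counts by mileage and weights them into a single 0-100 score. The metric names, weights, and scaling here are illustrative assumptions, not Tesla's actual methodology.

```python
from dataclasses import dataclass

@dataclass
class TripTelemetry:
    """Hypothetical per-trip metrics a vehicle's sensors might log."""
    miles: float
    hard_braking_events: int        # decelerations beyond some g-force threshold
    aggressive_turns: int           # lateral acceleration beyond some threshold
    forward_collision_warnings: int

def safety_score(trips: list[TripTelemetry]) -> float:
    """Return a 0-100 score; higher means lower predicted collision risk.

    The weights below are placeholder assumptions chosen for illustration,
    not Tesla's published values.
    """
    total_miles = sum(t.miles for t in trips)
    if total_miles == 0:
        return 100.0
    weights = {
        "hard_braking_events": 1.5,
        "aggressive_turns": 1.0,
        "forward_collision_warnings": 2.0,
    }
    penalty = 0.0
    for field, weight in weights.items():
        events = sum(getattr(t, field) for t in trips)
        # Normalize each behavior to a per-1,000-mile rate before weighting.
        rate_per_1k_miles = events / total_miles * 1000
        penalty += weight * rate_per_1k_miles
    return max(0.0, 100.0 - penalty)

# Example: a week of driving with a handful of flagged events.
week = [
    TripTelemetry(miles=42.0, hard_braking_events=1, aggressive_turns=0,
                  forward_collision_warnings=0),
    TripTelemetry(miles=18.5, hard_braking_events=0, aggressive_turns=2,
                  forward_collision_warnings=1),
]
print(f"Safety score: {safety_score(week):.1f}")
```

Normalizing by distance driven keeps a long commute from being penalized more heavily than a short errand with the same driving habits, which is presumably why any such score would be rate-based rather than a raw event count.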

Tesla hasn’t indicated whether customers will lose access to Full Self-Driving if their safety score drops below a specific threshold after the initial seven-day assessment. Although Musk has claimed that his ultimate objective is for Full Self-Driving to be able to control a fleet of autonomous “Robotaxis,” this is still a long way off. 

Source: redOrbit