Jennifer Homendy, chair of the National Transportation Safety Board, said Tesla must address “basic safety issues” before expanding its “Full Self-Driving” (FSD) mode. Tesla CEO Elon Musk said earlier this month that the company planned a broader FSD release by the end of September, making a public beta “request button” available to more Tesla customers.
As the WSJ reports, the update expands software that was primarily designed for highway driving to prepare vehicles for driving on city streets. Homendy had harsh words for Tesla’s use of the term “Full Self-Driving,” which she called “misleading and irresponsible,” adding that Tesla has “clearly misled numerous people to misuse and abuse technology.” The NTSB can conduct investigations and make recommendations, but it has no enforcement authority.
According to documents obtained by the legal transparency group PlainSite in May, Tesla’s director of Autopilot software told the California Department of Motor Vehicles that Musk had exaggerated the capabilities of the company’s advanced driver assistance system.
In February 2020, the NTSB found that Tesla’s Autopilot driver-assistance system was among the probable causes of a fatal 2018 crash, saying the driver, who was playing a mobile phone game while using Autopilot, was overly reliant on the system’s capabilities.
The NTSB says Tesla has ignored its 2017 safety recommendations on Autopilot. The agency told Tesla and five other automakers that they should add safeguards to advanced driver-assistance systems to make them more difficult to misuse.
The NTSB also recommended that automakers limit where and when such driver-assistance systems can be used. Tesla was the only automaker that did not formally respond to the NTSB’s recommendations, although it has increased the frequency of alerts when a driver takes their hands off the wheel while using Autopilot.