When Robin Geoulla bought a Tesla Model S in 2017, he was skeptical of the car's self-driving technology.
"It was a little scary to just sit back and let it drive, relying on it," he told a US investigator about Autopilot, Tesla's driver-assistance system, describing his early feelings about the technology.
Geoulla made the comments to investigators in January 2018, days after his Tesla, with Autopilot engaged, slammed into the back of an unoccupied fire truck parked on a California interstate highway. Reuters was unable to reach him for additional comment.
Over time, Geoulla's initial doubts about Autopilot softened, and he found it reliable when tracking a vehicle in front of him. But he noticed the system sometimes seemed confused when facing direct sunlight or when a vehicle ahead changed lanes, according to a transcript of his interview with a National Transportation Safety Board (NTSB) investigator.
He was driving into the sun before he rear-ended the fire truck, he told investigators.
Autopilot's design allowed Geoulla to disengage from driving during his trip, and his hands were off the wheel for almost all of the roughly 30 minutes the technology was activated, the NTSB found.
The NTSB, a US agency that makes safety recommendations but lacks enforcement power, has previously urged regulators at the National Highway Traffic Safety Administration (NHTSA) to investigate Autopilot's limits, the potential for driver misuse, and possible safety risks, following a series of crashes involving the technology, some of them fatal.
"The past has shown that the focus has been on innovation rather than safety, and I'm hoping that is changing," NTSB chair Jennifer Homendy told Reuters. She said there is no comparison between Tesla's Autopilot and the far more rigorous autopilot systems used in aviation, which involve trained pilots, rules addressing fatigue, and testing for drugs and alcohol.
Tesla did not respond to written questions for this story.
Autopilot is an advanced driver-assistance feature whose current version does not render the vehicle autonomous, the company says on its website. Tesla says drivers must agree to keep their hands on the wheel and maintain control of their vehicles before enabling the system.
Geoulla's 2018 crash is one of 12 accidents involving Autopilot that NHTSA officials are scrutinizing as part of the agency's most far-reaching investigation since Tesla introduced the semi-autonomous driving system in 2015.
Most of the crashes under investigation occurred after dark or in conditions of limited visibility, such as bright sunlight, according to an NHTSA statement, NTSB documents, and police reports reviewed by Reuters. That raises questions about Autopilot's capabilities in challenging driving conditions, according to autonomous-driving experts.
"NHTSA has robust enforcement and defect authority, and will act when we detect an unreasonable risk to public safety," an NHTSA spokesperson said in a statement to Reuters.
Since 2016, US vehicle-safety regulators have separately dispatched 33 special crash-investigation teams to review Tesla crashes involving 11 deaths in which the use of advanced driver-assistance systems was suspected. NHTSA has ruled out Autopilot use in three of those non-fatal crashes.
NHTSA's current Autopilot investigation effectively reopens the question of whether the technology is safe. It is the latest significant challenge for Elon Musk, Tesla's chief executive, whose advocacy of driverless cars has helped his company become the world's most valuable automaker.
Tesla charges customers up to $10,000 (roughly Rs. 7.5 lakh) for advanced driver-assistance features such as lane changing, with a promise to eventually deliver autonomous driving capability to their cars using only cameras and advanced software. Other automakers and self-driving companies use not only cameras but also more expensive hardware, including radar and lidar, in their current and future vehicles.
Musk has said a Tesla with eight cameras will be far safer than a human driver. But camera technology is affected by darkness and sun glare, as well as inclement weather such as heavy rain, snow, and fog, experts and industry executives say.
Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University, echoed those concerns about the limits of camera-based systems.
The first known fatal US crash involving Tesla's semi-autonomous driving technology occurred in 2016, west of Williston, Florida. The company said both the driver and Autopilot failed to see the white side of a tractor-trailer against a brightly lit sky. Instead of braking, the Tesla collided with the 18-wheel truck.
Driver misuse, braking failure
In January 2017, NHTSA completed an investigation into Autopilot prompted by that fatal crash, after some contentious exchanges with Tesla officials, according to documents reviewed by Reuters. It found no defect in Autopilot's performance.
In December 2016, as part of that investigation, the agency asked Tesla to provide details of the company's responses to any internal safety concerns raised about Autopilot, including the potential for driver misuse or abuse, according to a special order regulators sent to the automaker.
After an NHTSA lawyer found Tesla's initial response lacking, Tesla's then-general counsel, Todd Maron, tried again. He told regulators the request was overly broad and that it would be impossible to catalog every concern raised during Autopilot's development, according to correspondence reviewed by Reuters.
Nevertheless, Tesla wanted to cooperate, Maron told regulators. During Autopilot's development, company employees or contractors had raised concerns, which Tesla addressed, about the potential for unintended or failed braking and acceleration, and for undesired or failed steering, Maron said. He did not provide details about specific types of driver misuse or abuse.
Maron did not respond to messages seeking comment.
How regulators responded is unclear. One former US official said Tesla generally cooperated with the probe and produced requested materials promptly. Regulators closed the investigation shortly before the inauguration of former US President Donald Trump, finding that Autopilot performed as designed and that Tesla had taken steps to prevent its misuse.
Leadership vacuum at NHTSA
NHTSA has been without a Senate-confirmed chief for nearly five years. President Joe Biden has yet to nominate anyone to run the agency.
In its Autopilot investigation, regulators want to know how Tesla vehicles attempt to see the flashing lights of emergency vehicles, or detect the presence of fire trucks, ambulances, and police cars in their path, according to NHTSA documents. The agency has sought similar information from 12 rival automakers.
"Tesla has been asked to produce and validate data, as well as its interpretation of that data. NHTSA will conduct its own validation and analysis of all information," NHTSA told Reuters.
Musk, the electric-car pioneer, has fought hard to defend Autopilot from critics and regulators. Tesla has used Autopilot's ability to update vehicle software over the air to outflank and sidestep the traditional vehicle-recall process.
Musk has repeatedly promoted Autopilot's capabilities, sometimes, critics say, in ways that mislead customers into believing Teslas can drive themselves, despite warnings to the contrary in owner's manuals that tell drivers to remain engaged and that outline the technology's limitations.
Musk has also continued to launch what Tesla calls beta, or unfinished, versions of its "Full Self-Driving" system via over-the-air software upgrades.
"Some manufacturers are going to do what they want to do to sell their cars, and it's up to the government to rein that in," the NTSB's Homendy said.
© Thomson Reuters 2021
Tesla Autopilot Safety: Life-and-Death Issues for Regulators — https://gadgets.ndtv.com/transportation/features/tesla-autopilot-safety-concern-regulator-ntsb-nhtsa-us-elon-musk-2549415