Federal authorities are investigating whether a Tesla involved in a crash that killed three people and injured three others last week in Newport Beach had its Autopilot system activated at the time of the collision.
A special accident investigation team was dispatched for the May 12 incident on the Pacific Coast Highway, the National Highway Traffic Safety Administration said Wednesday.
In that incident, Newport Beach police were called around 12:45 a.m. to the 3000 block of Pacific Coast Highway, where they found a 2022 Tesla Model S sedan that had run onto a sidewalk and crashed into construction equipment.
Three people were found dead in the Tesla; they were identified last week as Crystal McCallum, 34, of Texas; Andrew James Chaves, 32, of Arizona; and Wayne Walter Swanson Jr., 40, of Newport Beach, according to the Orange County Sheriff’s Department.
Three construction workers sustained non-fatal injuries, police said, adding that the department’s serious-accident investigation team was involved.
Tesla, which dissolved its media relations department, did not respond Wednesday to a Times request for comment on the NHTSA investigation into the Orange County incident.
The federal investigation is part of the agency’s broader scrutiny of crashes involving advanced driver-assistance systems such as Tesla’s Autopilot. Investigators have been dispatched to 34 crashes since 2016 in which such systems were in use or suspected of being in use; 28 of those crashes involved Teslas, according to an NHTSA document released Wednesday.
In those 34 incidents, 15 people were killed and at least 15 others were injured; all but one of the deaths occurred in crashes involving Teslas, according to the document.
NHTSA told The Times on Wednesday evening that it does not comment on open investigations.
In addition to those crashes, NHTSA is investigating several incidents in which Teslas operating on Autopilot crashed into emergency vehicles parked along roadways despite flashing lights or warning cones, as well as a number of complaints that the Autopilot system triggers high-speed “phantom braking” for no apparent reason.
NHTSA is also investigating two incidents involving Volvos, a Navya shuttle crash, two crashes involving Cadillacs, one involving a Lexus and one involving a Hyundai. One of the Volvo crashes involved an Uber autonomous test vehicle that struck and killed a pedestrian in Arizona in March 2018.
In Los Angeles County, the district attorney’s office in January filed what experts believe is the first criminal case in the United States against a driver accused of causing deaths while using a partially automated driver-assistance system.
The charges came two years after the crash, which occurred in Gardena.
Kevin George Aziz Riad, 27, was behind the wheel of a 2016 Tesla Model S operating on Autopilot on Dec. 29, 2019, when he exited a freeway, ran a red light and crashed into a Honda Civic.
The Civic driver, Gilberto Alcazar Lopez, and his passenger, Maria Guadalupe Nieves-Lopez, were killed instantly.
Riad faces two counts of vehicular manslaughter.
Tesla has warned drivers using Autopilot, as well as its so-called Full Self-Driving system, that the cars cannot drive themselves and that drivers must be ready to intervene at all times.
Last June, NHTSA ordered dozens of car and tech companies to report crash data on automated vehicles so the agency could better monitor their safety.
No commercially available motor vehicle can drive itself completely, the agency said. Tesla’s Autopilot feature is classified as a “Level 2” driver-assistance system, meaning the vehicle can control steering and acceleration while a human in the driver’s seat can take control at any time.
“Whether a [Level 2] automated driving system is engaged or not, every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for the operation of their vehicles,” an NHTSA spokesperson said. “Certain advanced driver-assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly.”
Many legal experts say responsibility for Level 2 systems such as Autopilot falls squarely on the driver, not on companies whose marketing of the technology could lead consumers to believe the features are more capable than they are.
But the California Department of Motor Vehicles is grappling with confusion over Tesla’s Full Self-Driving feature, an advanced version of Autopilot intended eventually to do just what its name implies: provide full autonomy, to the point where no human is required to drive.
While other self-driving car developers, such as Waymo and Argo, use trained test drivers who follow strict safety rules, Tesla is conducting its tests using its own customers, charging car owners $12,000 for the privilege.
Other autonomous technology companies are required to report crashes and system failures to the DMV under its testing permit program, but the agency has exempted Tesla from those regulations.
After lobbying by state legislators, prompted by alarming videos on YouTube and Twitter highlighting Full Self-Driving’s poor performance, the DMV said in January that it was “revisiting” its position on Tesla’s technology.
The agency is also conducting a review to determine whether Tesla is violating another DMV regulation with its Full Self-Driving system, one that bars companies from marketing their cars as autonomous when they are not.
Times staff writer Russ Mitchell and the Associated Press contributed to this report.