
U.S. likely to cite Tesla crashes with automated systems

ASSOCIATED PRESS

This October 2019 photo shows a Tesla logo in Salt Lake City. The government plans soon to release data on collisions involving vehicles with autonomous or partially automated driving systems that will likely single out Teslas for a disproportionately high number of such crashes.

DETROIT >> The government will soon release data on collisions involving vehicles with autonomous or partially automated driving systems that will likely single out Tesla for a disproportionately high number of such crashes.

In coming days, the National Highway Traffic Safety Administration plans to issue figures it has been gathering for nearly a year. The agency said in a separate report last week that it had documented more than 200 crashes involving Teslas that were using Autopilot, “Full Self-Driving,” Traffic-Aware Cruise Control or another of the company’s partially automated systems.

Tesla’s figure and its crash rate per 1,000 vehicles was substantially higher than the corresponding numbers for other automakers that provided such data to The Associated Press ahead of NHTSA’s release. The number of Tesla collisions was revealed as part of an NHTSA investigation of Teslas on Autopilot that had crashed into emergency and other vehicles stopped along roadways.

Tesla does have many more vehicles with partly automated systems operating on U.S. roads than most other automakers do — roughly 830,000, dating to the 2014 model year. And it collects real-time data online from vehicles, so it has a much faster reporting system. Other automakers, by contrast, must wait for reports to arrive from the field and sometimes don’t learn about crashes for months.

In a June 2021 order, NHTSA told more than 100 automakers and automated vehicle tech companies to report serious crashes within one day of learning about them and to disclose less-serious crashes by the 15th day of the following month. The agency is assessing how the systems perform, whether they endanger public safety and whether new regulations may be needed.

General Motors said it reported three crashes while its “Super Cruise” or other partially automated systems were in use. The company said it has sold more than 34,000 vehicles with Super Cruise since its debut in 2017.

Nissan, with over 560,000 vehicles on the road using its “ProPilot Assist,” didn’t have to report any crashes, the company said.

Stellantis, formerly Fiat Chrysler, said it reported two crashes involving its systems. Ford reported none involving its “Blue Cruise” driver-assist system, which went on sale in the spring, though it wouldn’t say whether there were crashes involving its less-capable systems.

GM said the three crashes weren’t the fault of Super Cruise. It also reported two crashes that happened before the June 2021 order, a spokesman said.

Several automakers and tech companies, including Toyota and Honda, declined to release their numbers before the NHTSA data is revealed.

A message was left seeking comment from Tesla, which has disbanded its media relations department. NHTSA wouldn’t comment on the data today.

Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University who studies automated vehicles, said he wouldn’t be surprised if Tesla was found to have had a high number of crashes involving its driver-assist systems. Tesla, based in Austin, Texas, stopped using radar in its system and instead relies solely on cameras and computers — a system that Rajkumar calls “inherently unsafe.”

The system’s computer, he said, can recognize only what’s in its memory. Flashing lights on an emergency vehicle, Rajkumar said, might confuse the system, as would anything that the computer hasn’t seen before.

“Emergency vehicles may look very different from all the data that the Tesla software had been trained on,” he said.

In addition to the publicly released crash data, NHTSA has sent investigative teams to far more incidents involving Teslas using electronic systems than it has for other automakers. As part of a larger inquiry into crashes involving advanced driver assistance systems, the agency has sent teams to 34 crashes since 2016 in which the systems were thought to have been in use. Of the 34 crashes, 28 involved Teslas, according to an NHTSA document.

NHTSA said in documents that it has received 191 reports of crashes involving Teslas on Autopilot and nonemergency vehicles, plus 16 more involving parked emergency vehicles or those with warning lights, for a total of 207. Of the 191, the agency removed 85 because of actions of other vehicles or insufficient data to make a firm assessment of the crashes. That left 106 that were included in the Autopilot investigation.

It wasn’t clear if 207 matched the total number of Tesla crashes reported to NHTSA under the order. An NHTSA spokeswoman wouldn’t comment.

The agency ordered automakers and tech companies to report crashes involving driver-assist systems, as well as fully autonomous driving systems.

In defending its partially automated systems, Tesla has said that Autopilot and “Full Self-Driving” cannot drive themselves, and that drivers should be ready to intervene at all times. The systems can keep cars in their lanes and away from other vehicles and objects. But in documents released last week, NHTSA raised questions about whether human drivers can intervene fast enough to prevent crashes.

Tesla’s “Full Self-Driving” is designed to complete a route on its own with human supervision, with the eventual aim of driving itself and running a fleet of autonomous robo-taxis. In 2019, Tesla CEO Elon Musk pledged to have the robo-taxis running in 2020.

Tesla’s Autopilot driver-assist system detects hands on the steering wheel to make sure drivers are paying attention. But that’s inadequate, Rajkumar said. By contrast, systems such as GM’s monitor a driver’s eyes with a camera, he said, to make sure they’re looking forward.
