Study: Autonomous vehicles won't make roads completely safe

07 June, 2020
A new study says that while autonomous vehicle technology has great promise to reduce crashes, it may not be able to prevent all mishaps now caused by human error.

Auto safety experts say humans cause about 94% of U.S. crashes, but the Insurance Institute for Highway Safety study says computer-controlled robocars will stop only about a third of them.

The group says that while autonomous vehicles eventually will identify hazards and react faster than humans, and they won't become distracted or drive drunk, stopping the rest of the crashes will be a lot harder.

“We're still going to see some issues even if autonomous vehicles might react more quickly than humans do. They're not always going to be able to react instantaneously,” said Jessica Cicchino, an institute vice president of research and co-author of the study.

The IIHS studied more than 5,000 crashes with detailed causes that were collected by the National Highway Traffic Safety Administration, separating out those caused by “sensing and perceiving” errors such as driver distraction, impaired visibility or failing to spot hazards until it was too late. Researchers also separated crashes caused by human “incapacitation,” including drivers impaired by alcohol or drugs, those who fell asleep or drivers with medical problems. Self-driving vehicles can prevent those, the study found.
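That categorization is the crux of the study's arithmetic: only crashes whose primary cause falls in those first two groups are counted as avoidable. A minimal sketch of that kind of tally is below; the records, category names and code are purely hypothetical illustrations, not the IIHS methodology or data.

```python
# Illustrative only: tally hypothetical crash records by error category and
# estimate the share avoided if an idealized autonomous vehicle eliminated
# just the "sensing and perceiving" and "incapacitation" categories.
from collections import Counter

# Hypothetical sample records; the real study classified over 5,000 NHTSA crashes.
crashes = [
    {"id": 1, "error": "sensing_perceiving"},  # e.g. distraction, missed hazard
    {"id": 2, "error": "incapacitation"},      # e.g. alcohol, falling asleep
    {"id": 3, "error": "prediction"},          # e.g. misjudging another car's speed
    {"id": 4, "error": "planning"},            # e.g. driving too fast for conditions
    {"id": 5, "error": "execution"},           # e.g. incorrect evasive maneuver
]

avoidable = {"sensing_perceiving", "incapacitation"}

counts = Counter(c["error"] for c in crashes)
share_avoided = sum(counts[k] for k in avoidable) / len(crashes)

print(counts)
print(f"Share avoided under these assumptions: {share_avoided:.0%}")
```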

But the robocars may not be able to prevent the rest, including prediction errors such as misjudging how fast another vehicle is traveling, planning errors including driving too fast for road conditions, and execution errors including incorrect evasive maneuvers or other mistakes controlling vehicles.

For example, if a cyclist or another vehicle suddenly veers into the path of an autonomous vehicle, it may not be able to stop fast enough or steer away in time, Cicchino said. “Autonomous vehicles need to not only perceive the world around them perfectly, they need to respond to what's around them as well,” she said.

How many crashes are prevented depends a lot on how autonomous vehicles are programmed, Cicchino said. More crashes would be stopped if the robocars obey all traffic laws, including speed limits. But if artificial intelligence allows them to drive and react more like humans, then fewer crashes will be stopped, she said.

“Building self-driving cars that drive as well as people do is a big challenge in itself,” IIHS Research Scientist Alexandra Mueller said in a statement. “But they'd actually need to be better than that to deliver on the promises we've all heard.”

Partners for Automated Vehicle Education, a group that counts many self-driving vehicle companies among its members, said Thursday that the study incorrectly assumes superior perception and lack of distraction are the only ways autonomous vehicles can drive better than humans.

Autonomous vehicles, for example, can be programmed never to break traffic laws, which the study blames for 38% of crashes. “The assumption that these behaviors could be altered by passengers in ways that so dramatically reduce safety is inconsistent with what our members tell us about the culture they bring to AV development,” said a statement from the group, which includes Ford, General Motors, Waymo, Lyft, Daimler, Volkswagen and others.

The study's own numbers show autonomous vehicles would prevent 72% of crashes, the group said, but the vehicles are so complex that the ultimate impact is only a guess.

Yet Missy Cummings, a robotics and human factors professor at Duke University who is familiar with the study, said preventing even one-third of the human-caused crashes is giving the technology too much credit. Even vehicles with laser, radar and camera sensors don't always perform flawlessly in all conditions, she said.

“There is a probability that even when all three sensor systems come to bear, that obstacles can be missed,” Cummings said. “No driverless car company has been able to do that reliably. They know that, too.”

Researchers and people in the autonomous vehicle business never thought the technology would be capable of preventing all crashes now caused by humans, she said, calling that “layman's conventional wisdom that somehow this technology is going to be a panacea that will prevent all death.”

IIHS researchers reviewed the crash causes and decided which types could be prevented, assuming that all vehicles on the road were autonomous, Cicchino said. Even fewer crashes would be prevented while self-driving vehicles are mixed with human-driven cars, she said.

Virginia-based IIHS is a nonprofit research and education organization that’s funded by automobile insurance companies.

More than 60 companies have applied to test autonomous vehicles in California alone, but they have yet to start a fully robotic, large-scale ride-hailing service without human backup drivers.

Several companies, including Alphabet Inc.'s Waymo and General Motors' Cruise, had pledged to do so within the past two years, but those plans were delayed when the industry pulled back after an Uber automated test vehicle hit and killed a pedestrian in March 2018 in Tempe, Arizona.

Tesla Inc. CEO Elon Musk last year promised a fleet of autonomous robotaxis would start operating in 2020. But more recently he has said he hopes to deploy the system, with humans monitoring it, in early 2021, depending on regulatory approval.
Source: japantoday.com