I question their conclusion somewhat. They state that one of the scenarios is another vehicle veering into the path of an autonomous vehicle; but then later, they claim that they've reached this conclusion even assuming all vehicles were autonomous. In my mind, if all vehicles were autonomous, then barring some system failure -- which is unlikely -- there shouldn't be an issue of a vehicle veering into the path of another.
I would also think that sensors, or streaming weather and traffic conditions, could be built into the software to allow it to adjust to varying road conditions.
Steve
Study: Autonomous vehicles won't make roads completely safe.
DETROIT -- A new study says that while autonomous vehicle technology has great promise to reduce crashes, it may not be able to prevent all mishaps caused by human error.
Auto safety experts say humans cause about 94% of U.S. crashes, but the Insurance Institute for Highway Safety study says computer-controlled robocars will only stop about one-third of them. The group says that while autonomous vehicles eventually will identify hazards and react faster than humans, and they won't become distracted or drive drunk, stopping the rest of the crashes will be a lot harder.
"We're still going to see some issues even if autonomous vehicles might react more quickly than humans do. They're not going to always be able to react instantaneously," said Jessica Cicchino, an institute vice president of research and co-author of the study.
The IIHS studied over 5,000 crashes with detailed causes that were collected by the National Highway Traffic Safety Administration, separating out those caused by sensing and perceiving errors such as driver distraction, impaired visibility or failing to spot hazards until it was too late. Researchers also separated crashes caused by human incapacitation, including drivers impaired by alcohol or drugs, those who fell asleep or drivers with medical problems.
Self-driving vehicles can prevent those, the study found. However, the robocars may not be able to prevent the rest, including prediction errors such as misjudging how fast another vehicle is traveling, planning errors such as driving too fast for road conditions, and execution errors such as incorrect evasive maneuvers or other mistakes controlling vehicles.
"For example, if a cyclist or another vehicle suddenly veers into the path of an autonomous vehicle, it may not be able to stop fast enough or steer away in time," Cicchino said. "Autonomous vehicles need to not only perceive the world around them perfectly, they need to respond to what's around them as well," she said.
Just how many crashes are prevented depends a lot on how autonomous vehicles are programmed, Cicchino said. More crashes would be stopped if the robocars obey all traffic laws, including speed limits. But if artificial intelligence allows them to drive and react more like humans, then fewer crashes will be stopped, she said.
Building self-driving cars that drive as well as people do is a big challenge in itself, IIHS Research Scientist Alexandra Mueller said in a statement. "But they'd actually need to be better than that to deliver on the promises we've all heard."
Missy Cummings, a robotics and human factors professor at Duke University who is familiar with the study, said "preventing even one-third of the human-caused crashes is giving technology too much credit." "Even vehicles with laser, radar and camera sensors don't always perform flawlessly in all conditions," she said. "There is a probability that even when all three sensor systems come to bear, that obstacles can be missed," Cummings said. "No driverless car company has been able to do that reliably. They know that too."
Researchers and people in the autonomous vehicle business never thought the technology would be capable of preventing all crashes now caused by humans, she said, calling it layman's conventional wisdom that the technology would somehow be a panacea that prevents all deaths.
IIHS researchers reviewed the crash causes and decided which ones could be prevented, assuming that all vehicles on the road were autonomous, Cicchino said.
Even fewer crashes will be prevented while self-driving vehicles are mixed with human-driven cars, she said.
Virginia-based IIHS is a nonprofit research and education organization that's funded by auto insurance companies.
More than 60 companies have applied to test autonomous vehicles in California alone, but they have yet to start a fully robotic large-scale ride-hailing service without human backup drivers. Several companies, including Alphabet Inc.'s Waymo and General Motors' Cruise, had pledged to do it during the past two years, but those plans were delayed when the industry pulled back after an Uber automated test vehicle hit and killed a pedestrian in March 2018 in Tempe, Arizona. Companies are continuing to update and hone their autonomous vehicle systems. Tesla Inc. CEO Elon Musk last year promised a fleet of autonomous robotaxis would start operating in 2020, but recently he has said only that he hopes to deploy the system.