Bots Rule, Humans Drool: Impatient Humans to Blame in Self-Driving Car Accidents

© AP Photo / Eric Risberg / A Google self-driving car goes on a test drive near the Computer History Museum in Mountain View, Calif. While these cars promise to be much safer, accidents will be inevitable. How those cars should react when faced with a series of bad, perhaps deadly, options is a field even less developed than the technology itself.
Turns out that stopping at stop signs and even going the posted speed limit is not the best way to drive in this day and age, a report shows.

According to accident reports submitted to the California Department of Motor Vehicles, a whopping 15 out of 31 autonomous car collisions since 2016 were caused by impatient human drivers.

"[Self-driving cars] don't drive like people, they drive like robots," Mike Ramsey, an analyst who specializes in advanced automotive technology at Gartner Inc, told Bloomberg. "They're odd and that's why they get hit."

Per Bloomberg, accidents happened mostly at intersections where self-driving vehicles were waiting to turn right into oncoming traffic. The outlet noted that Waymo's now-retired "Firefly" car was rear-ended on two separate occasions when it stopped to yield before making a right turn.

Another incident saw a robotic vehicle get clipped by a truck trying to get past the car's grandma-like driving skills, otherwise known as going the speed limit.

"You put a car on the road which may be driving by the letter of the law, but compared to the surrounding road users, it's acting very conservatively," explained Karl Iagnemma, CEO of NuTonomy, a self-driving software developer. "This can lead to situations where the autonomous car is a bit of a fish out of water."

Wanting to solve the issue, the techies at Waymo put on their thinking caps and came up with a few solutions, including having the cars act more "natural" by making wider turns and teaching the vehicles to inch forward at flashing yellow lights, the outlet reported.

"They were cutting the corners really close, closer than humans would," Missy Cummings, a robotics professor at Duke University, said. "We typically take wider turns."

Working alongside the Virginia Tech Transportation Institute, the Ford Motor Company went as far as conducting an experiment to see how autonomous cars could communicate with humans via light signals.

"Humans violate the rules in a safe and principled way, and the reality is that autonomous vehicles in the future may have to do the same thing if they don't want to be the source of bottlenecks," Iagnemma added.

Though California is currently the only state that requires collision reports when an autonomous car is struck, it's a safe bet that at least some human drivers need to dust off their old driving school manuals.
