If you’re wondering why autonomous cars get into fender benders during tests, it may be because robots tend to be sticklers for the law, while humans aren’t.

It’s why developers are closely monitoring how autonomous cars interact with human drivers on the road, in preparation for a future where we will all have to co-exist.

“They don’t drive like people. They drive like robots,” said Mike Ramsey, an analyst at Gartner Inc, who specializes in advanced automotive technologies. “They’re odd and that’s why they get hit.”

Still, as Autonews points out, while sharing the road with a driverless car might make you a bit wary, in reality those cars will prove to be overly cautious, obeying traffic regulations at all times — something human drivers don’t always do.

“If the cars drive in a way that’s really distinct from the way that every other motorist on the road is driving, there will be in the worst case accidents and in the best case frustration,” says NuTonomy CEO, Karl Iagnemma. “What that’s going to lead to is a lower likelihood that the public is going to accept the technology.”

According to Iagnemma, autonomous cars struggle to translate visual cues into predictions about other drivers’ behavior, something humans do with ease thanks to their experience with fellow human drivers. In California alone, since the beginning of 2016, autonomous vehicles have been rear-ended 13 times out of 31 total collisions, and those rear-endings almost always occur at intersections.

“You put a car on the road which may be driving by the letter of the law, but compared to the surrounding road users, it’s acting very conservatively,” added Iagnemma. “This can lead to situations where the autonomous car is a bit of a fish out of water.”

With that in mind, it’s interesting to see how Waymo is approaching such issues, refining the way its vehicles behave so that they appear more “natural.” For example, the developer altered the software dictating how the cars handle turns in order to make the ride more comfortable for passengers.

“They were cutting the corners really close, closer than humans would,” said Duke University robotics professor Missy Cummings. “We typically take wider turns.”

Another interesting thing Waymo is doing is teaching their cars to inch forward at flashing yellow lights.

“Humans violate the rules in a safe and principled way, and the reality is that autonomous vehicles in the future may have to do the same thing if they don’t want to be the source of bottlenecks,” concluded Iagnemma.

It seems, then, that we’re inevitably headed toward a future in which human and robot drivers alike will need to be aware of each other’s tendencies and drive in a similar fashion.

Video via Fresco News / YouTube