Self-Driving Cars Still Have A Lot To Learn

A new report details every incident self-driving cars faced in California last year, and they still have a lot to learn.

Self-driving cars give you time to enjoy the passing scenery, like that excellent VW bus.
Photo: Yuri Kadobnov / Contributor (Getty Images)

Every year, we’re told that this is the year we’ll get self-driving cars. And every year, come December, those autonomous vehicles fail to materialize anywhere beyond the research facility. But now, a report from California has outlined some of the things self-driving cars still need to learn before they can be let loose on the public.

In the Golden State, some of the companies pioneering self-driving cars have been testing their autonomous vehicles out on the streets. Firms like Cruise, Waymo and Apple have all sent fleets out onto California’s roads to test their mettle.

And now, the California Department of Motor Vehicles has published a report outlining every issue these self-driving cars faced in 2021.

The DMV has strict rules for anyone testing self-driving cars in the state. As such, every time a test vehicle is out on the road and a driver has to take over for any reason, the incident must be logged. At the end of the year, these incidents are all compiled in the Disengagement Report, which includes more than 2,500 incidents from the past 12 months.

There’s still a Waymo to go before autonomous cars can hit the mainstream.
Photo: Glenn Chapman / Contributor (Getty Images)

The Disengagement Report shows that there are 25 companies licensed to test their autonomous cars on the streets of California. OEMs like Toyota, Mercedes and Nissan are on the list, alongside tech firms like Qualcomm and NVIDIA.

There’s a Tesla-shaped hole in this list, though, as the company prefers to let its customers test out its latest Level 2 driver-assist systems, with fairly troubling results at times.

But no matter the company, each autonomous vehicle tester in California’s report seems to be encountering similar issues, all falling under the three Ps: perception, prediction and planning.

Object perception is about what the software driving the autonomous car thinks is on the road ahead. The perception issues self-driving cars faced all come down to a car mistaking an object for something else, like reading a red traffic light as a green one.

Everything from “small objects in the road” to “incorrectly perceived rain” led to unwanted braking. At other times, the cars were late to apply the brakes. In one test, a self-driving car was “late to perceive” an animal crossing the road, and the test driver had to slam on the anchors.

Do you ever get a sense that you’re being watched?
Photo: David McNew / Stringer (Getty Images)

Then there are the prediction issues, which are all about the way self-driving cars “guess” how the objects they observe will behave. Here, test drivers were forced to step in when the cars couldn’t correctly predict how pedestrians would behave, how other cars in traffic would act, or that a parked car wouldn’t move. In each instance, an incorrect prediction produced an “undesirable motion plan” and forced the test driver to take over.

Then there are the planning issues. These aren’t about guessing what objects will do, but about how the car itself decides to respond to other road users, such as other cars, trucks, pedestrians crossing the road, or even cyclists.

So here, it’s all about how the car plans to react to vehicles changing lanes on a highway, trucks making wide turns, or pedestrians “making illegal crossings.”
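
To make the three Ps a little more concrete, here’s a minimal Python sketch of how a perception, prediction and planning pipeline might hang together, and how a shaky detection can trigger the kind of takeover the DMV counts. To be clear, every name, function and threshold below is invented for illustration; none of it comes from any company’s actual software.

```python
# A minimal, purely illustrative "three Ps" pipeline. Every class,
# function and threshold here is hypothetical -- nothing is taken
# from any company's actual self-driving stack.
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str          # what perception thinks it sees
    confidence: float   # how sure the model is, from 0.0 to 1.0

def perceive(frame: dict) -> list[DetectedObject]:
    # 1. Perception: turn raw sensor data into labeled objects.
    return [DetectedObject(d["label"], d["conf"]) for d in frame["detections"]]

def predict(obj: DetectedObject) -> str:
    # 2. Prediction: guess how each detected object will behave.
    return "will_cross" if obj.label == "pedestrian" else "stationary"

def plan(behaviors: list[str]) -> str:
    # 3. Planning: decide the car's own response to those guesses.
    return "brake" if "will_cross" in behaviors else "proceed"

def drive_tick(frame: dict, min_conf: float = 0.9) -> str:
    objects = perceive(frame)
    # A low-confidence detection (rain read as an obstacle, a red light
    # read as green) is the perception failure mode the report describes;
    # the safe response is handing control back to the human.
    if any(o.confidence < min_conf for o in objects):
        return "DISENGAGE: uncertain perception"
    return plan([predict(o) for o in objects])

# Toy run: a confident pedestrian detection should end in braking.
frame = {"detections": [{"label": "pedestrian", "conf": 0.97}]}
print(drive_tick(frame))  # -> brake
```

Real systems are enormously more complicated, of course, but the failures in the report map onto these same three stages: a bad label, a bad guess, or a bad plan.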

Here’s how a self-driving car sees the world.
Photo: Glenn Chapman / Staff (Getty Images)

Away from the three Ps, self-driving cars also had issues maintaining an appropriate speed on various roads. Test drivers reported taking the wheel when the self-driving car was following the speed limit but driving “too slow or too fast given the traffic and road conditions.”

There’s also the whole “map discrepancy” issue, which seemingly affects only Apple-operated vehicles. I guess that’s just more Apple Maps woes, which is something we’ll all have to learn to live with.

Then there are the more general hardware and software issues.

Sometimes, drivers were forced to take the wheel because data recorders failed, certain components went offline, or a software glitch asked the test driver to take over. Some companies also reported “precautionary” takeovers when their cars approached pedestrians, traffic signals or certain stopped vehicles. And finally, there are all the times test drivers were forced to take the wheel when they encountered a “recklessly behaving road user.” Because, of course, you can program an autonomous car to follow the rules of the road, but you’ll sadly never get some people to do the same.