Self-driving cars can be fooled by fake cars, pedestrians and other bogus signals

Self-driving cars from Google and Uber – and even Apple’s rumoured effort – could all be foiled by little more than a homebrew laser pointer, thanks to their Lidar sensor systems.

Jonathan Petit, principal scientist at software security company Security Innovation, has unearthed a gaping security vulnerability in Lidar sensors – essentially the eyes of any self-driving car. In a research paper due to be presented at the Black Hat Europe security conference in November, Petit outlines how a low-power laser and pulse generator can fool a car into believing that other cars and pedestrians are around it.

The vulnerability means that self-driving cars could be brought to a halt in the middle of the road when they believe another car or pedestrian has suddenly appeared in their path. On roads shared between self-driving and conventional cars, the dangers of such a hack hardly need pointing out.

We’ve seen other connected-car hacks, including one that resulted in the recall of 1.4 million Jeeps, but Petit’s attack works by fooling a car’s sensors rather than exploiting weaknesses in its software security.

Having set out to expose the security issues around autonomous cars, Petit began by recording pulses from a commercial Ibeo Lux Lidar unit. Discovering that the pulses were neither encoded nor encrypted, he could simply replay them later to fool the unit into registering objects that weren’t there. “The only tricky part was to be synchronised, to fire the signal back at the Lidar at the right time,” said Petit. “Then the Lidar thought that there was clearly an object there.”
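To see why that synchronisation is the tricky part, consider the arithmetic a Lidar relies on: it infers an object’s distance from the round-trip time of its own light pulse, so a replayed pulse arriving after a chosen delay creates a phantom at whatever range the attacker picks. The sketch below is purely illustrative – the function and names are hypothetical, not code from Petit’s paper.

```python
# Illustrative sketch of the timing behind a Lidar replay attack
# (hypothetical names; not taken from Petit's paper). A Lidar infers
# range from the round-trip time of its own pulse, so a recorded pulse
# fired back after a chosen delay fakes an object at a chosen distance.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def spoof_delay(fake_distance_m: float) -> float:
    """Delay (seconds) after the Lidar's outgoing pulse at which a
    replayed pulse must arrive to fake an object at fake_distance_m."""
    return 2.0 * fake_distance_m / SPEED_OF_LIGHT

# A phantom car 20 m ahead requires a ~133-nanosecond delay, which is
# why firing the signal back "at the right time" is the hard part.
print(f"{spoof_delay(20.0) * 1e9:.1f} ns")
```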

“I can spoof thousands of objects and basically carry out a denial-of-service attack on the tracking system so it’s not able to track real objects.” According to IEEE Spectrum, Petit’s attack worked from any direction at distances of up to 100 metres, and didn’t require the laser beam to be aimed precisely at the Lidar unit.

While Petit proposes a defence based on “misbehaviour detection” – filtering out physically implausible obstacles – he doesn’t believe carmakers have even considered doing such a thing. There are currently no commercial autonomous cars on the road for the general public, so there’s still plenty of time for Google and other self-driving car manufacturers to introduce encryption methods.
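One way such misbehaviour detection could work – and this is a minimal sketch under assumptions of our own, not a description of Petit’s proposal – is to discard tracked objects whose apparent motion between Lidar frames implies an impossible speed:

```python
# A minimal sketch of one plausibility check a "misbehaviour detection"
# layer might apply (hypothetical implementation, not from the paper):
# reject detections whose frame-to-frame motion implies impossible speed.

MAX_PLAUSIBLE_SPEED = 70.0  # m/s (~250 km/h); an assumed threshold

def is_plausible(prev_pos, curr_pos, dt: float) -> bool:
    """Return False if the object moved impossibly fast between frames."""
    dx = curr_pos[0] - prev_pos[0]
    dy = curr_pos[1] - prev_pos[1]
    speed = (dx * dx + dy * dy) ** 0.5 / dt
    return speed <= MAX_PLAUSIBLE_SPEED

# A "pedestrian" that jumps 50 m between 25Hz frames (dt = 0.04 s)
# implies 1,250 m/s and would be filtered out as a spoofed object.
print(is_plausible((0.0, 0.0), (50.0, 0.0), 0.04))  # False
```

A filter like this would catch phantom objects that pop into existence or teleport between scans, though a patient attacker could still inject obstacles that move plausibly – which is why Petit also points towards encrypting or encoding the pulses themselves.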

Ultimately, this latest car hack is yet another signal that industries traditionally far removed from data security now need to start taking it seriously.
