Self-driving cars can be brought to a standstill - 9th Sep 2015 4:27pm
From the BT News pages -
Self-driving cars can be brought to a standstill by laser pointers, say security researchers
A hack using cheap laser kits can create phantom cars and people which force self-driving cars to stop.
Security researchers have found a flaw that could allow anyone to fool a driverless car into stopping – simply by using a technique involving laser pointers.
Sensors on board autonomous vehicles allow them to detect obstacles, but researchers at Security Innovation say that a laser kit costing under £40 can be used to create phantom objects that force the car to come to a halt.
“I can take echoes of a fake car and put them at any location I want,” Principal scientist Jonathan Petit told IEEE Spectrum. “And I can do the same with a pedestrian or a wall. If a self-driving car has poor inputs, it will make poor driving decisions.”
The simple hack – also known as ‘spoofing’ – messes with laser ranging lidar systems, which most self-driving cars rely on to generate an image of their surroundings. Petit was able to create several fake obstacles that worked within 100m of the lidar unit.
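The trick relies on how lidar works: the unit measures the round-trip time of its own light pulse, so an attacker who records a pulse and replays it after a chosen delay can make a phantom appear at any range they like. A minimal sketch of that timing arithmetic (illustrative only, not Petit's actual setup):

```python
# Illustrative sketch of the timing behind a lidar spoofing attack.
# A lidar computes range from pulse round-trip time:
#     distance = (speed_of_light * delay) / 2
# so replaying a captured pulse after a chosen delay places a
# phantom object at an arbitrary distance.

C = 299_792_458.0  # speed of light in m/s

def spoof_delay(phantom_distance_m: float) -> float:
    """Replay delay (seconds) that makes a phantom appear at the given range."""
    return 2.0 * phantom_distance_m / C

def apparent_distance(delay_s: float) -> float:
    """Range the lidar will report for an echo arriving after delay_s."""
    return C * delay_s / 2.0

# A phantom car 50 m ahead needs a replay delay of roughly a third of a microsecond.
d = spoof_delay(50.0)
print(f"{d * 1e9:.1f} ns")  # ~333.6 ns
```

The sub-microsecond delays involved are well within reach of cheap off-the-shelf electronics, which is why the reported kit cost is so low.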
“There are ways to solve it,” he continued. “A strong system that does misbehaviour detection could cross-check with other data and filter out those that aren’t plausible. But I don’t think car makers have done it yet. This might be a good wake-up call for them.”
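The cross-checking Petit describes could look something like the following sketch: discard lidar detections that no second, independent sensor corroborates. The radar track list and the tolerance value here are hypothetical; real sensor-fusion systems are far more sophisticated, this only illustrates the idea.

```python
# Minimal sketch of misbehaviour detection via sensor cross-checking:
# keep only lidar detections that a second sensor (here, hypothetical
# radar ranges) also sees within a tolerance. Spoofed phantoms that
# appear on lidar alone get filtered out.

def filter_implausible(lidar_ranges_m, radar_ranges_m, tolerance_m=2.0):
    """Keep lidar detections corroborated by a nearby radar detection."""
    plausible = []
    for lidar_r in lidar_ranges_m:
        if any(abs(lidar_r - radar_r) <= tolerance_m for radar_r in radar_ranges_m):
            plausible.append(lidar_r)
    return plausible

# A spoofed phantom at 30 m shows up on lidar but not radar, so it is dropped.
lidar = [12.4, 30.0, 55.1]  # metres; 30.0 is the injected phantom
radar = [12.1, 54.8]
print(filter_implausible(lidar, radar))  # [12.4, 55.1]
```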
Petit will present a paper on the hack at the Black Hat Europe security conference held in November.
Google, which has led the way on self-driving cars, has experienced several accidents since hitting the road – although all have been put down to human error. In July, one of the firm’s Lexus SUV driverless cars was rear-ended in Google's home city of Mountain View, California.
It makes me wonder what would happen if the police pointed a laser speed gun at one?