Security specialists trick Tesla’s Computer Vision algorithm with a drone and a projector

A new video and research paper from Georgia Tech and Ben-Gurion University of the Negev show that researchers have been able to trick Tesla’s Autopilot system with a cheap, off-the-shelf projector strapped to a drone.

This sort of attack isn’t new, but it does highlight how attacks could change in the future. You might remember a similar 2018 Princeton & Purdue paper that deceived autonomous vehicles using fake traffic signs. That research revealed a gap in the technology which allows “…an adversary to stealthily embed a potentially dangerous traffic sign into an innocuous one.”

This new research was again able to trick the computer vision / machine learning algorithm into changing the car’s speed by projecting a phantom road sign, and even a fake pedestrian, highlighting some significant security gaps in this technology.

The research was conducted by the university’s cyber security centre to identify some of the serious gaps that exist by exploiting what the researchers call the ‘perceptual challenge’ – a phrase coined within their paper.

The research demonstrates “…how attackers can exploit this perceptual challenge to apply phantom attacks…without the need to physically approach the attack scene.” The researchers continue: “…our experiments show that when presented with various phantoms, a car’s ADAS or autopilot considers the phantoms as real objects, causing these systems to trigger the brakes, steer into the lane of oncoming traffic, and issue notifications about fake road signs.”
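One intuition behind why brief projections work is that a system reacting frame by frame will respond to an object that appears for only a fraction of a second. A minimal sketch (a toy model, not Tesla’s or Mobileye’s actual pipeline; all function names here are hypothetical) shows the difference between reacting to any single detection and requiring the object to persist across several consecutive frames:

```python
# Toy model of per-frame vs. persistence-based reaction to detections.
# A phantom projected for only a few frames fools the per-frame policy
# but not one that demands the object persist.

def reacts_per_frame(detections):
    """Brake as soon as any single frame reports an object."""
    return any(detections)

def reacts_with_persistence(detections, n=5):
    """Brake only if an object is seen in n consecutive frames."""
    run = 0
    for seen in detections:
        run = run + 1 if seen else 0
        if run >= n:
            return True
    return False

# A phantom flashed for 3 frames out of 30:
phantom = [False] * 10 + [True] * 3 + [False] * 17
# A real pedestrian visible for 20 consecutive frames:
real = [False] * 5 + [True] * 20 + [False] * 5

print(reacts_per_frame(phantom))         # True  -> spurious braking
print(reacts_with_persistence(phantom))  # False -> phantom filtered
print(reacts_with_persistence(real))     # True  -> real object kept
```

The obvious trade-off is latency: waiting for persistence delays the reaction to genuinely sudden hazards, which is presumably why production systems err on the side of reacting quickly.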

The experiment was conducted on a Tesla Model X running Autopilot, as well as the Mobileye 630 PRO – another advanced computer vision / machine learning system used to enhance driver safety in cars like the Mazda 3. As you can see in the video below, the researchers project an image of a vehicle, create false highway speed signs and, worst of all, fake road markings – all of which cause the automated system to react and potentially create a hazard for other road users.

With the implementation of 5G, the researchers suggest that a communication protocol allowing autonomous vehicles to ‘double check’ what they’re ‘seeing’ could help resolve many of the issues around fake highway signs or road markings.
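The ‘double check’ idea can be sketched as cross-referencing a camera reading against an independent trusted source, such as an HD map or a signed roadside broadcast, before acting on it. The protocol below is purely illustrative – the lookup table, segment names and fallback policy are assumptions, not anything specified in the paper:

```python
# Hypothetical 'double check' for camera-detected speed signs: accept the
# camera's reading only when a trusted second source (e.g. an HD map or a
# signed V2X broadcast) agrees. All data and names here are illustrative.

TRUSTED_SPEED_LIMITS = {
    "A580-junction-12": 50,  # limit per the trusted map, in mph
}

def validated_speed_limit(road_segment, camera_reading, default=None):
    """Return the speed limit to act on after cross-checking sources."""
    trusted = TRUSTED_SPEED_LIMITS.get(road_segment)
    if trusted is None:
        return default           # no second source: fall back to policy
    if camera_reading == trusted:
        return camera_reading    # both sources agree: act on it
    return trusted               # disagreement: prefer the trusted source

# A phantom '90' sign projected where the map says 50:
print(validated_speed_limit("A580-junction-12", 90))  # 50
```

A real deployment would of course need the second source itself to be authenticated (e.g. cryptographically signed broadcasts), otherwise an attacker could simply spoof that channel instead.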

However, the fact that a drone-mounted projector can create phantom objects in the road that influence the behaviour of an autonomous vehicle clearly demonstrates how far this technology has to go before we can truly take our eyes off the road whilst driving.

Written by Zack Raja: z.raja@talentometry.co.uk | 0161 790 9872
