Six boffins, mostly hailing from Singapore-based universities, say they can prove it's possible to interfere with autonomous vehicles by exploiting the machines' reliance on camera-based computer vision, causing them to fail to recognize road signs.
[Figure: The GhostStripe paper's illustration of the 'invisible' adversarial attack against a self-driving car's traffic sign recognition]

The team developed two versions of this stabilized attack. The first, GhostStripe1, does not require access to the vehicle, we're told. It employs a tracking system to monitor the target vehicle's real-time location and dynamically adjusts the LED flickering accordingly to ensure a sign isn't read properly. The second, GhostStripe2, is targeted and does require access to the vehicle: a transducer placed on the camera's power wire detects framing moments and refines the attack's timing control.
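The flickering works because a rolling-shutter CMOS camera reads a frame out one row at a time: a light source that changes color faster than the rows are scanned paints different rows different colors. Here is a minimal simulation of that striping effect; the resolution, per-row readout time, and flicker frequency are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Sketch (assumed parameters, not from the GhostStripe paper): simulate how a
# fast-flickering RGB LED appears as horizontal color stripes on a
# rolling-shutter sensor. Each row is exposed slightly later than the one
# above it, so each row samples whichever LED color was lit at its own
# exposure time.

ROWS, COLS = 480, 640     # hypothetical sensor resolution
LINE_TIME = 30e-6         # assumed per-row readout time, seconds
FLICKER_HZ = 2000         # assumed LED color-switch frequency

# Three LED colors cycled in sequence.
LED_COLORS = np.array([[1.0, 0.2, 0.2],   # red phase
                       [0.2, 1.0, 0.2],   # green phase
                       [0.2, 0.2, 1.0]])  # blue phase

def capture_frame(frame_start: float) -> np.ndarray:
    """Return an HxWx3 frame where each row takes the LED color
    that was active at that row's exposure time."""
    row_times = frame_start + np.arange(ROWS) * LINE_TIME
    # Which color in the cycle is lit when each row is exposed?
    phase = np.floor(row_times * FLICKER_HZ).astype(int) % len(LED_COLORS)
    frame_rows = LED_COLORS[phase]                 # shape (ROWS, 3)
    return np.repeat(frame_rows[:, None, :], COLS, axis=1)

frame = capture_frame(frame_start=0.0)
# Count stripes: a color change between adjacent rows marks a boundary.
changes = np.any(frame[1:, 0] != frame[:-1, 0], axis=-1).sum()
print(f"{changes + 1} color stripes across {ROWS} rows")
```

With these assumed numbers the color flips roughly every 17 rows, so a single frame picks up about 29 stripes. In a real attack the flicker would have to stay synchronized with the camera's actual readout timing as the vehicle moves, which is why the attack needs either live vehicle tracking or a tap on the camera itself.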
The team tested their system on a real road, using a car equipped with a Leopard Imaging AR023ZWDR, the camera used in Baidu Apollo's hardware reference design, and targeted stop, yield, and speed limit signs.