GhostStripe attack haunts self-driving cars by making them ignore road signs

  • 📰 TheRegister


Cameras tested are specced for Baidu's Apollo

Six boffins, mostly hailing from Singapore-based universities, say they can interfere with autonomous vehicles by exploiting the machines' reliance on camera-based computer vision, causing them to fail to recognize road signs.

[Figure: The GhostStripe paper's illustration of the 'invisible' adversarial attack against a self-driving car's traffic sign recognition]

The team developed two versions of this stabilized attack. The first, GhostStripe1, does not require access to the vehicle, we're told. It employs a tracking system to monitor the target vehicle's real-time location and dynamically adjusts the LED flickering accordingly to ensure a sign isn't read properly.
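The attack relies on the rolling shutter of CMOS cameras: because sensor rows are exposed sequentially, a light source flickering faster than the frame rate paints horizontal stripes across the captured image. A minimal sketch of that effect, with made-up sensor timings (the function name and all parameter values are illustrative assumptions, not figures from the paper):

```python
def rolling_shutter_rows(n_rows, row_time_s, flicker_hz, duty=0.5):
    """Per-row brightness factors (1.0 = LED on, 0.0 = LED off) for a
    sensor whose rows are read out one after another in time."""
    period = 1.0 / flicker_hz
    factors = []
    for r in range(n_rows):
        t = r * row_time_s                # time at which this row is exposed
        phase = (t % period) / period     # position within the flicker cycle
        factors.append(1.0 if phase < duty else 0.0)
    return factors

# Example: a 1080-row sensor at ~30 µs per row, LED flickering at 2 kHz.
stripes = rolling_shutter_rows(1080, 30e-6, 2000)

# Counting on/off transitions down the frame reveals the stripe pattern
# the camera records even though the flicker is invisible to the eye.
bands = sum(1 for r in range(1, 1080) if stripes[r] != stripes[r - 1])
```

With these numbers, one flicker period spans roughly 17 sensor rows, so dozens of alternating bright and dark bands cross a single frame; GhostStripe times such flicker so the bands corrupt the pixels covering a road sign.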

The team tested their system on a real road, using a car equipped with a Leopard Imaging AR023ZWDR, the camera used in Baidu Apollo's hardware reference design. The setup was tested against stop, yield, and speed limit signs.

