Poltergeist Attack Targets Self-Driving Cars, Blinding Them Via Audio Signals

Researchers have found yet another way to disrupt autonomous vehicles. This time, the strategy is to blind self-driving cars with acoustic signals via the Poltergeist attack.

Poltergeist Attack Blinding Self-Driving Cars

A team of researchers from the Ubiquitous System Security Lab (USSL) at Zhejiang University and the Security and Privacy Research Group (SPRG) at the University of Michigan has devised a new cyberattack that can compromise the perception of self-driving cars. Dubbed “Poltergeist,” the attack blinds self-driving cars with acoustic signals.

The attack exploits the susceptibility of the inertial sensors underpinning the vehicle’s image stabilization to acoustic manipulation. With deliberately crafted sound, an attacker can distort the captured images and trick the vehicle’s object detection system into perceiving objects differently. According to the researchers,

By emitting deliberately designed acoustic signals, an adversary can control the output of an inertial sensor, which triggers unnecessary motion compensation and results in a blurred image, even if the camera is stable.

This, in turn, corrupts the decision-making function of the car, which can lead to devastating real-world consequences.
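
To make the mechanism concrete, the sketch below simulates that blur in isolation: a false angular-velocity reading, such as acoustic resonance might induce, is converted into a motion-blur kernel and applied to a camera frame. The frame file, the injected rate, and the pixels-per-rad/s scale are all illustrative assumptions, not values from the paper.

```python
import cv2
import numpy as np

def motion_blur_kernel(length: int, angle_deg: float) -> np.ndarray:
    """Linear motion-blur kernel of the given length and direction."""
    kernel = np.zeros((length, length), dtype=np.float32)
    kernel[length // 2, :] = 1.0                       # horizontal streak
    center = ((length - 1) / 2.0, (length - 1) / 2.0)
    rot = cv2.getRotationMatrix2D(center, angle_deg, 1.0)
    kernel = cv2.warpAffine(kernel, rot, (length, length))
    return kernel / kernel.sum()

# A false angular-velocity reading translated into a compensation shift
# in pixels. Both numbers are illustrative assumptions, not calibration
# values from the paper.
false_rate_rad_s = 0.8
pixels_per_rad_s = 25
blur_len = max(3, int(false_rate_rad_s * pixels_per_rad_s))

frame = cv2.imread("frame.jpg")                        # placeholder file
blurred = cv2.filter2D(frame, -1, motion_blur_kernel(blur_len, 0.0))
cv2.imwrite("frame_blurred.jpg", blurred)
```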

Specifically, the researchers devised three types of attacks against the target vehicle’s object detector:

  • Hiding attacks: making the object detector fail to recognize a physically present object.
  • Creating attacks: tricking the target vehicle’s system into recognizing an object that does not physically exist.
  • Altering attacks: altering a physically existent object so that the vehicle’s system misidentifies it as another object.

The researchers evaluated these attacks against four different object detectors (YOLO V3/V4/V5 and Faster R-CNN) and one commercial detection system (Apollo).
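
As a rough illustration of how such blur can degrade detection, here is a minimal sketch that uses torchvision’s pretrained Faster R-CNN, one of the detector families named above, as a stand-in. It compares detections on a clean frame and its blurred counterpart; the file names carry over from the previous sketch and are placeholders, and this is not the researchers’ evaluation harness.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Pretrained Faster R-CNN as a stand-in detector (torchvision >= 0.13).
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect(path: str, score_thresh: float = 0.5):
    """Return (label, score) pairs above the threshold for one image."""
    img = to_tensor(Image.open(path).convert("RGB"))
    with torch.no_grad():
        out = model([img])[0]
    keep = out["scores"] > score_thresh
    return list(zip(out["labels"][keep].tolist(),
                    out["scores"][keep].round(decimals=2).tolist()))

# A hiding attack succeeds if detections present on the clean frame
# vanish from the blurred one.
for name in ("frame.jpg", "frame_blurred.jpg"):
    print(name, "->", detect(name))
```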

For a real-world demonstration, the team targeted a Samsung S20 smartphone serving as the camera-based perception unit of an Audi Q3.

The researchers have shared multiple demonstration videos of the attack.

Recommended Countermeasures

To prevent Poltergeist (PG) attacks, the researchers advise shielding the MEMS inertial sensors with microfibrous metallic fabric and applying a low-pass filter to the sensor output to fend off malicious acoustic signals.

In addition, enhancing image-stabilization techniques and hardening object detection algorithms against blur might also help mitigate PG attacks on self-driving cars.
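
To illustrate the filtering half of that advice, here is a minimal sketch of a low-pass filter applied to a simulated gyroscope trace. The sample rate, cutoff, and the frequencies and amplitudes of the genuine and injected components are assumptions chosen for demonstration, not figures from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                                  # IMU sample rate (assumed)
t = np.arange(0.0, 1.0, 1.0 / fs)

# Genuine low-frequency motion plus a high-frequency injected component.
# Frequencies and amplitudes are illustrative; real MEMS resonance bands
# sit elsewhere.
true_rate = 0.1 * np.sin(2 * np.pi * 2 * t)
injected = 0.8 * np.sin(2 * np.pi * 300 * t)
measured = true_rate + injected

# 4th-order Butterworth low-pass with a 20 Hz cutoff (assumed), well
# below the injected tone and above genuine vehicle motion.
b, a = butter(4, 20.0 / (fs / 2.0), btype="low")
filtered = filtfilt(b, a, measured)

print("peak error before filtering:", np.max(np.abs(measured - true_rate)))
print("peak error after filtering: ", np.max(np.abs(filtered - true_rate)))
```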

The researchers have elaborated on their findings in a detailed research paper, alongside an explanation on GitHub.

Let us know your thoughts in the comments.
