Pasting stickers on street signs can confuse self-driving cars

Deep neural networks have become pretty good at classifying images, but it's still worth remembering that their method is quite unlike the way humans recognize images, even if the end results are fairly comparable. We were reminded of that once again this week when reading about a technique for spoofing road signs. It's an approach that just looks like street art to you or me, but it completely changes the meaning of a stop sign to the computer reading it.

No real self-driving cars were wrecked in this study

First off, it's important to note that the work is a proof of concept; no actual automotive-grade machine vision systems were used in the test. Covering your neighborhood stop signs in strips of black and white tape is not going to lead to a sudden spate of car crashes today. Ivan Evtimov, a graduate student at the University of Washington, and some colleagues first trained a deep neural network to recognize different US road signs. Then, they built an algorithm that generated alterations to the signs that human eyes find harmless, but that changed the meaning when a sign was read by the AI classifier they had just trained.
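The general idea behind such attacks can be sketched with a toy example. This is not the researchers' actual optimization procedure (their method targets a real image classifier and constrains perturbations to look like stickers); it is a minimal, hypothetical illustration of the underlying principle, using a made-up linear classifier and a fast-gradient-sign-style step:

```python
import numpy as np

# Hypothetical toy linear classifier: score = w . x,
# predicting class 1 (say, "stop sign") if the score is positive.
w = np.array([0.9, -0.4, 0.7, 0.2])   # assumed "trained" weights
x = np.array([1.0, 0.5, 1.2, 0.8])    # clean input features

def predict(features):
    return int(w @ features > 0)

# Gradient-sign-style perturbation: for this linear model, the
# gradient of the score with respect to the input is just w, so
# shifting each feature by -eps * sign(w) pushes the score down as
# far as possible while keeping every per-feature change small
# (|delta_i| <= eps) -- small enough that a human might not notice,
# yet large enough to flip the classifier's decision.
eps = 0.9
x_adv = x - eps * np.sign(w)

print(predict(x))      # clean input: class 1
print(predict(x_adv))  # perturbed input: class 0
```

The attacks in the paper work the same way in spirit: because the attacker can query the model's internals, they can compute exactly which small, bounded changes to the input move it across a decision boundary.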

Evtimov and his co-authors propose two different ways to attack a road sign: either by printing an altered copy that you overlay on the actual sign, or by simply making small additions with stickers. There is also a choice of alterations. One is to use subtle perturbations that make the sign look weathered to a casual observer. The other is to disguise the changes so they look like street art: in this case, either small black and white stripes or blocky text reading LOVE and HATE.

The results were pretty striking. One test made a stop sign reliably misread as a speed limit sign, and another caused a right-turn sign to be classified as either a stop sign or an added-lane sign. To repeat: these attacks worked on the specific computer vision system the researchers trained, and the altered signs would not fool any cars on the road today. But they do demonstrate that this kind of spoofing can work, provided one has access to the training set and the system being attacked.

