Pinboard (jm)
https://pinboard.in/u:jm/public/
Recent bookmarks from jm

These stickers make AI hallucinate things that aren’t there - The Verge
2018-01-04T10:47:12+00:00
https://www.theverge.com/2018/1/3/16844842/ai-computer-vision-trick-adversarial-patches-google
The sticker “allows attackers to create a physical-world attack without prior knowledge of the lighting conditions, camera angle, type of classifier being attacked, or even the other items within the scene.” So, after such an image is generated, it could be “distributed across the Internet for other attackers to print out and use.”
This is why many AI researchers are worried about how these methods might be used to attack systems like self-driving cars. Imagine a little patch you can stick onto the side of the motorway that makes your sedan think it sees a stop sign, or a sticker that stops you from being identified by AI surveillance systems. “Even if humans are able to notice these patches, they may not understand the intent [and] instead view it as a form of art,” the researchers write.
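The key property described above is that the patch works as a simple physical overlay: it just occludes part of the scene, like a printed sticker, rather than requiring pixel-level tampering with the whole image. A minimal sketch of that compositing step, using NumPy and placeholder data (the scene, patch contents, and function name are illustrative assumptions, not from the paper):

```python
import numpy as np

def apply_patch(image, patch, x, y):
    """Paste a patch into a scene image at pixel offset (x, y).

    The patch simply overwrites the pixels it covers -- mimicking a
    physical sticker, which occludes the scene rather than blending
    into it. The attack works wherever the patch lands in frame.
    """
    out = image.copy()
    h, w = patch.shape[:2]
    out[y:y + h, x:x + w] = patch
    return out

# Placeholder data: a flat gray "scene" and a random "patch"
# (stand-ins for a real photo and a trained adversarial patch).
scene = np.full((224, 224, 3), 0.5, dtype=np.float32)
patch = np.random.default_rng(0).random((50, 50, 3)).astype(np.float32)

attacked = apply_patch(scene, patch, x=30, y=120)
```

In a real attack the patch contents come from an optimization loop that maximizes a target class score across many scenes, positions, and rotations; the pasting step itself stays this simple, which is what makes the printed-sticker threat practical.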
Tags: self-driving cars ai adversarial-classification security stickers hacks vision surveillance classification
https://pinboard.in/u:jm/b:73ea74cc0eee/