
OPLIN 4Cast #710: Selfies, Guy Fawkes, and Data Poisoning

Posted in 4cast and facial recognition

Last updated on November 24, 2020

OK, we talk about facial recognition a lot here on the 4cast. We wouldn’t blame you if you suspected we were a little obsessed. But when I caught a random news article about how researchers at the University of Chicago have developed a tool, called “Fawkes” in honor of the Guy Fawkes masks sported by protesters, that subtly alters photos to disrupt the systems used to build facial recognition models, I was intrigued, and it led me into the larger field of data poisoning.

  • Image-scaling attacks highlight dangers of adversarial machine learning [TechTalks] “The way these AI algorithms learn to tell the difference between different objects is different from how human vision works. Most adversarial attacks exploit this difference to create small modifications that remain imperceptible to the human eye while changing the output of the machine learning system.” (A minimal sketch of such a perturbation follows this list.)
  • Fawkes protects your identity from facial recognition systems, pixel by pixel [ZDNet] “The Fawkes system is a form of data poisoning. The aim is to post photos which, once scraped by a machine learning service, teach the model the wrong features and misdirect them in what makes a subject unique.” (The second sketch below illustrates this poisoning idea.)
  • This Tool Could Protect Your Photos From Facial Recognition [NY Times] “Ideally, people would start cloaking all the images they uploaded. That would mean a company like Clearview that scrapes those photos wouldn’t be able to create a functioning database, because an unidentified photo of you from the real world wouldn’t match the template of you that Clearview would have built over time from your online photos.”
  • ‘Cloaking’ is the new biz of hiding from facial recognition [The Hustle] “But they might be too late. Clearview AI has already copied billions of photos off the internet. And its CEO says Fawkes could improve Clearview’s tech.”
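The TechTalks quote describes the basic trick behind most of these attacks: nudge every pixel a tiny amount in whatever direction most changes the model’s answer. Here is a minimal sketch of that idea, the fast gradient sign method, in Python. Everything in it is an assumption made for illustration: the “model” is a random, untrained logistic regression standing in for a face recognizer, and the epsilon value is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for a face recognizer: logistic regression over 64 "pixels".
# A real attack targets a trained neural network, but the mechanics of the
# fast gradient sign method (FGSM) are the same.
w = rng.normal(size=64)   # hypothetical model weights
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(x):
    """Model's confidence that input x is a photo of person A."""
    return sigmoid(w @ x + b)

# A "clean photo" the model confidently labels as person A.
x_clean = w / np.linalg.norm(w)
print("clean photo:", predict(x_clean))        # ~1.0

# FGSM step: for this linear model the gradient of the score with respect
# to the input is just w, so moving every pixel by -eps * sign(w) pushes
# the prediction as hard as possible toward the other class.
eps = 0.25
x_adv = x_clean - eps * np.sign(w)
print("perturbed photo:", predict(x_adv))      # ~0.0

# No pixel moved by more than eps; on a real 0-255 image, a change of a
# few intensity levels per pixel is invisible to a human viewer.
print("max pixel change:", np.abs(x_adv - x_clean).max())
```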
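Fawkes itself computes a subtle, per-image “cloak” with a neural network, which is well beyond a blog post. But the core idea ZDNet describes, poisoned training photos teaching a scraper’s model the wrong features, can be shown with a deliberately blunt sketch. The two-dimensional “photos,” the constant shift, and scikit-learn’s LogisticRegression standing in for a face recognizer are all illustrative assumptions, not Fawkes’s actual method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Pretend each "photo" is a 2-D feature vector. Identity A's real photos
# cluster around (0, 0); identity B's cluster around (4, 4).
photos_A = rng.normal(loc=(0.0, 0.0), scale=0.5, size=(100, 2))
photos_B = rng.normal(loc=(4.0, 4.0), scale=0.5, size=(100, 2))

# "Cloaking" as crude data poisoning: before posting photos online,
# identity A shifts them to the far side of identity B in feature space.
# (Fawkes computes a subtle per-image perturbation; this constant shift
# is only for illustration.)
cloak = np.array([8.0, 8.0])
posted_A = photos_A + cloak

# The scraper trains its recognizer on the poisoned, posted photos.
X_train = np.vstack([posted_A, photos_B])
y_train = np.array([0] * 100 + [1] * 100)   # 0 = identity A, 1 = identity B
model = LogisticRegression().fit(X_train, y_train)

# A fresh, uncloaked photo of identity A (say, from a street camera) no
# longer matches what the model "learned" identity A looks like.
fresh_A = rng.normal(loc=(0.0, 0.0), scale=0.5, size=(20, 2))
print("accuracy on real-world photos of A:",
      model.score(fresh_A, np.zeros(20, dtype=int)))   # ~0.0
```

The Hustle’s caveat applies even to this toy version: if the scraper has already collected clean, uncloaked photos of you, as Clearview reportedly has by the billions, poisoning only the photos you post from now on may not be enough.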

