Privacy Tool 'Cloaks' Faces to Trick Facial Recognition Software

The 'Fawkes' method was developed by researchers at the University of Chicago's SAND Lab.
Chris Young

Ever since the Cambridge Analytica scandal broke in 2018, the use of private individual data by large corporations has been a global issue of top concern.

Last year, a company even developed a website that allows users to avoid booking with airlines that are unclear about how they use passenger facial recognition, or biometric, data.

Now, as part of an increasingly desperate bid to retain our privacy in this digital age, researchers at the University of Chicago's SAND Lab have developed a tool that can "cloak" our images by altering them slightly, almost imperceptibly, so they can't be picked up by facial recognition software.


A vendetta against facial recognition firms

The tool, dubbed 'Fawkes', was named after the Guy Fawkes masks worn by groups like Anonymous and inspired by the Alan Moore graphic novel V for Vendetta. Fawkes uses artificial intelligence (AI) to make tiny alterations to features of your face.

To a casual observer, there is no difference. But the changes can defeat facial recognition software's main function, which is, of course, to recognize an individual and match them to their data.

Original images compared with their cloaked versions, Source: SAND Lab, University of Chicago

The Fawkes tool could prove to be particularly useful for social media users. Earlier this year, The New York Times reported that facial recognition firm Clearview AI collected up to three billion images from sites like Facebook and YouTube without users' express consent in order to test its software.

Fawkes, in effect, could give social media users a layer of protection against such invasive practices, the researchers say. The tool 'cloaks' the user's face by adding an invisible layer of changes, making facial recognition software believe the cloaked face belongs to a completely different person than the one in the original image.
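The idea can be illustrated with a toy sketch. This is not the Fawkes algorithm itself; it simply shows, using NumPy and a hypothetical stand-in "feature extractor" (a fixed random linear projection in place of a real face-recognition model), how a bounded, visually negligible per-pixel change can still move an image's feature representation by a large amount:

```python
import numpy as np

# Toy illustration (NOT the actual Fawkes method): a tiny, bounded
# pixel perturbation can shift an image's feature embedding far from
# the original, while the pixel-level change stays imperceptible.

rng = np.random.default_rng(0)

# Stand-in "face image": 32x32 grayscale, intensities in [0, 1].
image = rng.random((32, 32))

# Hypothetical feature extractor: a fixed random linear projection,
# standing in for a recognition model's embedding layer.
W = rng.standard_normal((128, 32 * 32))

def embed(img):
    """Map an image to a 128-dimensional feature vector."""
    return W @ img.ravel()

# "Cloak": nudge every pixel by at most epsilon in a direction chosen
# to push the embedding away from the original (one sign-based step,
# in the spirit of gradient-based adversarial perturbations).
epsilon = 0.01  # max per-pixel change: 1% of the intensity range
direction = np.sign(W.sum(axis=0)).reshape(32, 32)
cloaked = np.clip(image + epsilon * direction, 0.0, 1.0)

pixel_change = np.abs(cloaked - image).max()
embedding_shift = np.linalg.norm(embed(cloaked) - embed(image))

print(f"max pixel change: {pixel_change:.4f}")  # tiny, invisible to the eye
print(f"embedding shift:  {embedding_shift:.2f}")  # much larger in feature space
```

The contrast between the two printed numbers is the whole point: what a human perceives (pixel intensities) and what a model "perceives" (feature-space position) can be decoupled, which is the property cloaking tools exploit.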

A 'cloaked' Barack Obama image (right) compared with the original image (left), Source: SAND Lab, University of Chicago

“What we are doing is using the cloaked photo in essence like a Trojan Horse, to corrupt unauthorized models to learn the wrong thing about what makes you look like you and not someone else,” Ben Zhao, a professor of computer science at the University of Chicago who worked on the Fawkes software, explained to The Verge in an interview.

"Once the corruption happens, you are continuously protected no matter where you go or are seen," he continued.

A 100% success rate

According to the team from the University of Chicago, Fawkes has shown a 100 percent success rate against some of the most advanced facial recognition services, including Microsoft's Azure Face, Amazon's Rekognition, and Face++ by Chinese tech company Megvii.

The team behind Fawkes published a paper on their algorithm earlier this year. Late last month, however, they released Fawkes as free software for Windows and Mac. They report that it has already been downloaded more than 100,000 times.

Despite everything else going on in the world, it seems that online privacy is still an important concern for people the world over.
