
‘Adversarial glasses’ can fool even state-of-the-art facial-recognition tech

Wonry/Getty Images
You may have heard about so-called “adversarial” objects that are capable of baffling facial recognition systems, either by making them fail to detect an object entirely or by prompting them to classify it incorrectly — for example, mistaking a 3D-printed toy turtle for a rifle. Well, researchers at Carnegie Mellon University and the University of North Carolina at Chapel Hill have just found a practical, scalable, and somewhat scary application — anti-facial-recognition glasses.
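To get a feel for how adversarial examples work in general, here is a minimal, illustrative sketch of the classic fast gradient sign method (FGSM) in PyTorch. The pretrained ResNet model and the epsilon value are stand-ins chosen for illustration; this is a generic technique, not the researchers' own method, which is described further below.

```python
# Minimal FGSM sketch: nudge an image in the direction that increases the
# classifier's loss, producing a visually similar but misclassified input.
# The model and epsilon are illustrative placeholders.
import torch
import torchvision.models as models

model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.eval()

def fgsm_attack(image, true_label, epsilon=0.03):
    """Return a perturbed copy of `image` (shape [1, 3, H, W], values in
    [0, 1]) that pushes the model away from `true_label`."""
    image = image.clone().detach().requires_grad_(True)
    loss = torch.nn.functional.cross_entropy(model(image), true_label)
    loss.backward()
    # Step along the sign of the gradient, then clamp to valid pixel range.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0, 1).detach()
```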

Building on previous work by the same group from 2016, the researchers built five pairs of adversarial glasses that work for roughly 90 percent of the population, making them close to a “universal” attack. When worn, the glasses cause even state-of-the-art facial recognition systems to misidentify the wearer (or, as the researchers describe it, they “facilitate misclassification”). And far from resembling the goofy disguises people might once have worn to avoid being recognized, the eyeglasses look entirely normal to other people.
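The team's full pipeline is more involved, but the core trick of this line of work can be sketched as an optimization that confines the adversarial perturbation to an eyeglass-frame-shaped region, so only the “frames” change while the rest of the face is untouched. The sketch below assumes a classifier `model`, a face image `face`, a binary frame `mask`, and a `target` identity label; it is a simplified illustration, not the authors' released code.

```python
# Simplified sketch of a mask-constrained adversarial attack: optimize a
# perturbation that is zero everywhere except inside an eyeglass-frame mask.
# `model`, `face`, `mask`, and `target` are assumed inputs.
import torch

def optimize_glasses(model, face, mask, target, steps=300, lr=0.01):
    # `mask` is 1 inside the frame region, 0 elsewhere.
    delta = torch.zeros_like(face, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        adv = (face + delta * mask).clamp(0, 1)
        # Minimizing this loss pushes the classifier toward the attacker's
        # chosen identity (impersonation); maximizing the loss on the true
        # identity instead would give "dodging".
        loss = torch.nn.functional.cross_entropy(model(adv), target)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return (face + delta * mask).clamp(0, 1).detach()
```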


The eyeglasses were tested successfully against the deep neural network-based VGG and OpenFace face recognition systems. Although instructions for building them have not been made publicly available, the researchers say the glasses could be 3D-printed by users.
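Systems like VGG and OpenFace map each face to an embedding vector and match identities by comparing embedding distances, so one common way to score a “dodging” attack is to check whether the adversarial face's embedding drifts past the match threshold. The helper below is a hypothetical sketch assuming an embedding function `embed` and a cosine-similarity threshold; the paper's actual evaluation protocol may differ.

```python
# Hedged sketch of evaluating a dodging attack against an embedding-based
# recognizer: the attack "succeeds" if the adversarial face no longer
# matches the enrolled identity. `embed` and `threshold` are assumptions.
import torch.nn.functional as F

def dodging_succeeds(embed, clean_face, adv_face, threshold=0.9):
    # Cosine similarity between embeddings of the clean and adversarial
    # face; falling below the match threshold means the system no longer
    # links the adversarial face to the enrolled identity.
    sim = F.cosine_similarity(embed(clean_face), embed(adv_face))
    return sim.item() < threshold
```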

Facial recognition has no problem identifying the Owen Wilson on the left. The one on the right? Not so much.

Whether the technology is good or bad depends largely on how you perceive facial recognition. On the one hand, it’s easy to see how privacy advocates would be excited at the prospect of glasses that help bypass today’s surveillance society, in which we’re reportedly photographed as many as 70 times per day and can be readily identified through facial recognition. (Similar facial recognition disguises are already available on the market.)

On the other hand, facial recognition is frequently used to keep citizens safe by identifying potentially dangerous individuals in places like airports. For this reason, the researchers have passed their findings to the Transportation Security Administration (TSA) and recommended that the agency consider asking passengers to remove seemingly innocuous items like glasses and jewelry in the future, since these “physically realizable attack artifacts” could be used to beat even state-of-the-art recognition systems.

A paper describing the researchers’ work was recently published online, titled “Adversarial Generative Nets: Neural Network Attacks on State-of-the-Art Face Recognition.”

Luke Dormehl