Annoyed at being automatically tagged by Facebook's facial-recognition system? Wearing a pair of tie-dye-looking glasses could help. Carnegie Mellon University researchers recently conducted a study that concluded the right pair of glasses could trick facial-recognition software into thinking you are someone else. In fact, the glasses, which cost less than a quarter to make, successfully concealed the wearer's identity in 100 percent of the tests.
While machine learning has allowed computers to recognize faces with ease, computers don't look at a picture of a face the same way we do: where humans see features like eyes and beauty marks, a computer sees only pixels. Facial-recognition software works from measurements of those facial features, and the researchers theorized that the right pattern printed on a pair of glasses could throw off those measurements.
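To get a feel for how such an attack works in principle, here is a minimal sketch in PyTorch, assuming a differentiable face classifier. The model, image, labels, and mask below are hypothetical placeholders, not the CMU team's actual system; the point is only that gradient descent can tune a printable pattern, confined to the glasses region, until the classifier's measurements no longer match the wearer.

```python
import torch
import torch.nn.functional as F

# Hypothetical stand-ins: any differentiable face classifier would do.
# The CMU team attacked real recognition networks; this toy model is
# only for illustration.
model = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(3 * 64 * 64, 10),  # 10 imaginary identities
)
model.eval()

face = torch.rand(1, 3, 64, 64)   # placeholder face image, values in [0, 1]
true_id = torch.tensor([3])       # the wearer's real identity label

# Binary mask marking where the glasses sit; only these pixels may change.
mask = torch.zeros(1, 3, 64, 64)
mask[:, :, 20:28, 8:56] = 1.0     # crude wide-rim "frames" region

perturbation = torch.zeros_like(face, requires_grad=True)
optimizer = torch.optim.Adam([perturbation], lr=0.05)

for _ in range(200):
    # Apply the printable pattern only inside the glasses region.
    adversarial = (face + perturbation * mask).clamp(0.0, 1.0)
    logits = model(adversarial)
    # Dodging attack: maximize the loss on the true identity so the
    # system no longer matches the wearer. (An impersonation attack
    # would instead minimize the loss on a chosen target identity.)
    loss = -F.cross_entropy(logits, true_id)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

adversarial = (face + perturbation * mask).clamp(0.0, 1.0)
print("predicted identity:", model(adversarial).argmax().item())
```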
Working backward from the patterns that facial-recognition systems key on, the researchers printed colorful patterns onto a pair of glasses. Choosing a wide-rim style let the spectacles take up about 6.5 percent of the pixels in the images tested.
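That coverage figure is just masked pixels over total pixels. Continuing with the toy mask from the sketch above (the study's 6.5 percent describes the researchers' real test images, not this placeholder):

```python
# Fraction of image pixels covered by the glasses mask. The toy 64x64
# mask above (8 rows x 48 columns of frame pixels) comes to about
# 9.4 percent; the study's wide-rim frames covered about 6.5 percent.
coverage = (mask[0, 0].sum() / mask[0, 0].numel()).item()
print(f"glasses cover {coverage:.1%} of the image")
```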
The glasses had a 100 percent success rate at letting the wearer fly under the radar, and in some instances they could even impersonate someone else. The software matched one of the spectacle-clad researchers, a 41-year-old white male, to actress Milla Jovovich, and the computer was 87.87 percent confident in the accuracy of that match.
The researchers' intent was to find out just how easily the facial-recognition systems behind security cameras can be fooled while the attacker still looks inconspicuous. Facial-recognition software will recognize when a person is wearing sunglasses and leave the eye area out of its calculation, so dark lenses alone don't defeat it. The right makeup and even LED lights can also fool a camera, but those tricks are far more obvious.
While the brightly colored glasses may not be entirely inconspicuous, they don't set off the instant alarm that a ski mask in July would either. And since the patterns were simply printed onto the frames, crafting a pair of camera-fooling shades wouldn't be too tough for someone looking to trick security cameras.
“As our reliance on technology increases, we sometimes forget that it can fail,” the study concluded. “In some cases, failure may be devastating and risk lives. Our work and previous work show that the introduction of [machine learning] to systems, while bringing benefits, increases the attack surface of those systems. Therefore, we should advocate for the integration of only those [machine learning] algorithms that are robust against evasion.”
Additional research still needs to be conducted, the study noted, including testing the eyeglasses under other variables, such as the distance between the wearer and the camera and different lighting conditions.