- Facial recognition technology is everywhere: More than half of adult Americans’ faces are now logged in police databases.
- To push back against surveillance, designers have invented clothes and accessories that make your face undetectable. The accessories combine fashion and technology, and can trick algorithms meant to detect and identify faces.
- The designs have been used by protesters aiming to avoid police surveillance in places like Hong Kong and the US, but they aren’t foolproof – some newer facial recognition algorithms are being developed to see past the visual tricks.
- Visit Business Insider’s homepage for more stories.
Smile! You’re on camera – or you were at some point in the past few years – and now your face is public domain.
Facial recognition technology is everywhere, and only becoming more pervasive. It’s marketed as a security feature by companies like Apple and Google to prevent strangers from unlocking your iPhone or front door.
It’s also used by government agencies like police departments. More than half of adult Americans’ faces are logged in police databases, according to a study by Georgetown researchers. Facial recognition technology is used by governments across the globe to identify and track dissidents, and has been deployed by police against Hong Kong protesters.
To push back, privacy-focused designers, academics, and activists have created wearable accessories and clothes meant to thwart facial recognition tech. Demonstrators from Hong Kong to the US have worn such designs to remain anonymous at protests, and encrypted messaging app Signal has even started distributing free anti-facial recognition masks to George Floyd protesters.
Facial recognition software uses artificial intelligence to detect faces or human figures in real time. But that software is fallible – clothing can “dazzle” the software with misleading shapes that stop the AI from knowing what it’s looking at. Other designs confuse AI with images of decoy faces, preventing it from making the right identification.
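The principle behind “dazzling” a detector can be sketched with a toy example. Real systems use deep neural networks, but this stand-in “detector” simply correlates an input against a stored template – the template, threshold, and pixel values below are all hypothetical – which is enough to show the idea: a detector keys on expected patterns, and a high-contrast overlay destroys those patterns.

```python
# Toy illustration of how misleading patterns defeat a detector.
# All values here are invented for illustration; real detectors
# are neural networks, not template matchers.

FACE_TEMPLATE = [1, 1, 0, 0, 1, 1, 0, 0]  # hypothetical 8-pixel "face" pattern
THRESHOLD = 3

def detect_face(pixels):
    """Return True if the input correlates strongly with the template."""
    score = sum(p * t for p, t in zip(pixels, FACE_TEMPLATE))
    return score >= THRESHOLD

def apply_dazzle(pixels):
    """Overlay high-contrast stripes (invert alternating pixels),
    mimicking how dazzle designs disrupt expected facial features."""
    return [1 - p if i % 2 == 0 else p for i, p in enumerate(pixels)]

plain_face = [1, 1, 0, 0, 1, 1, 0, 1]  # closely matches the template
dazzled = apply_dazzle(plain_face)

print(detect_face(plain_face))  # True: the detector "sees" a face
print(detect_face(dazzled))     # False: the pattern no longer matches
```

The point is not the arithmetic but the fragility: the detector never “understands” a face, it only measures resemblance to patterns it was trained on, and a pattern designed to break that resemblance makes the face vanish from the machine’s view.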
These designs are still niche, and have appeared mostly as art installations or academic projects. But as facial recognition becomes more widespread, they may catch on as the next trend in functional fashion.
It should be noted, however, that the anti-facial recognition designs are not foolproof, and some algorithms are already being developed to overcome them.
Here are the ingenious, bizarre designs meant to outsmart facial recognition tech.
A lens-shaped mask makes its user undetectable to facial recognition algorithms while still letting other people read the wearer’s expressions and recognize who they are.
The “surveillance exclusion” mask was designed by Jip van Leeuwenstein while he was a student at the Utrecht School of the Arts in the Netherlands.
The mask’s curvature blocks facial recognition from all angles.
“Because of its transparency you will not lose your identity and facial expressions,” van Leeuwenstein writes, “so it’s still possible to interact with the people around you.”
A Dutch design student invented a projector that superimposes an image of a different face over that of the wearer.
Jing-cai Liu created the wearable face projector: A “small beamer projects a different appearance on your face, giving you a completely new appearance.”
The device rapidly cycles through the faces it projects, making detection even more difficult.
Images of Liu’s face projector went viral last year after misleading tweets claimed it was being used by protesters in Hong Kong – a claim that was later debunked.
A Japanese college professor designed goggles fitted with LEDs that thwart facial recognition.
Isao Echizen, a professor at the National Institute of Informatics in Tokyo, designed the “privacy visor” as a safeguard against security cameras that could log someone’s face without their permission.
Images from Echizen’s lab show how the visor blocks AI’s ability to detect a face.
The device is fitted with “a near-infrared LED that appends noise to photographed images without affecting human visibility.”
When switched on, a user’s face no longer scans as a human face to the AI, indicated by the lack of green boxes above.
A German designer created a cloth mask that resembles a blurred face.
“The full face mask Pixelhead acts as media camouflage, completely shielding the head to ensure that your face is not recognisable on photographs taken in public places without securing permission,” creator Martin Backes writes.
An artist designed a toolkit of avant-garde makeup and styling tips that can make faces unrecognizable to AI.
A makeup technique known as CV Dazzle, pioneered by the artist Adam Harvey, uses fashion to combat facial recognition. It was recently featured at a workshop at the Coreana Museum of Art in Seoul, pictured here.
CV Dazzle combines makeup, hair extensions, accessories, and gems to transform people’s faces.
The technique gets its name from a World War I tactic – naval vessels were painted with black and white stripes, making it harder for distant ships to judge their size and the direction they were heading.
Sanne Weekers, a design student in the Netherlands, created a headscarf decorated with faces intended to confuse algorithms.
“By giving an overload of information software gets confused, rendering you invisible,” Weekers wrote of the scarf.
Belgian scientists developed a prototype for a graphic print that could be added to clothing to “attack” and baffle surveillance technology.
Belgian computer scientists Simen Thys, Wiebe Van Ranst, and Toon Goedemé designed “adversarial patches” as part of a study funded by KU Leuven.
“We believe that, if we combine this technique with a sophisticated clothing simulation, we can design a T-shirt print that can make a person virtually invisible for automatic surveillance cameras,” the researchers wrote.
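The researchers’ patches were optimized with gradient descent against a neural person-detector (YOLOv2); the same objective – choose patch pixels that minimize the detector’s confidence – can be sketched with a toy linear “detector,” where it is simple enough to solve greedily. The weights and images below are invented for illustration, not taken from the study.

```python
# Minimal sketch of the adversarial-patch objective, assuming a toy
# linear person-detector: score = w . x, "person" if score > 0.
# The real study attacked a neural network with gradient descent;
# here the same goal is reached by greedily choosing each covered
# pixel to push the detection score down as far as possible.

WEIGHTS = [0.9, -0.2, 0.8, 0.7, -1.5, 0.6]  # hypothetical detector weights

def detect_person(image):
    score = sum(w * x for w, x in zip(WEIGHTS, image))
    return score > 0

def optimize_patch(image, patch_indices):
    """For each pixel the patch covers, pick the value (0 or 1)
    that lowers the detection score the most."""
    attacked = list(image)
    for i in patch_indices:
        # a positive weight contributes to "person", so zero it out;
        # a negative weight argues against "person", so max it out
        attacked[i] = 0 if WEIGHTS[i] > 0 else 1
    return attacked

person = [1, 0, 1, 1, 0, 1]                     # score 3.0: detected
patched = optimize_patch(person, [0, 2, 3, 4])  # patch covers four pixels

print(detect_person(person))   # True: unmodified figure is detected
print(detect_person(patched))  # False: the patch suppresses detection
```

The patch only needs to cover part of the image to flip the decision, which is what makes a printed panel on a T-shirt plausible as an attack surface.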
An artist created masks that evade facial recognition and send a message about invasions of privacy.
“‘Facial Weaponization Suite’ protests against biometric facial recognition – and the inequalities these technologies propagate – by making ‘collective masks’ in workshops that are modelled from the aggregated facial data of participants, resulting in amorphous masks that cannot be detected as human faces by biometric facial recognition technologies,” creator Zach Blas writes.
Blas’ masks also explore the potential of algorithm-driven facial recognition to enact bias and produce false positives.
Blas intended the masks pictured above to depict the “tripartite conception of blackness: the inability of biometric technologies to detect dark skin as racist, the favouring of black in militant aesthetics, and black as that which informatically obfuscates,” he writes.