8 methods for bypassing cameras and facial recognition software
Facial recognition systems are becoming more sophisticated. When we post a photo on Facebook with a close friend, the system will almost certainly identify both faces and suggest adding a tag. According to ethical hacking specialists, governments around the world and private companies are driving the creation of massive facial databases that could serve various purposes, a practice too invasive for people’s privacy.
Surveillance is governments’ main motivation for using facial recognition. Airport security is one example; countries such as the United States and Brazil are implementing this technology to prevent terrorist attacks. Other governments, such as China’s, are deploying face recognition for the systematic surveillance and repression of some minority groups.
Below, ethical hacking specialists from the International Institute of Cyber Security (IICS) list several methods to bypass facial recognition systems and protect our privacy.
Facial jewelry
A group of designers started a project they call “facial jewelry”. One of the designs consists of two brass circles that hang from the cheekbones and cross the wearer’s forehead; besides being a fashion statement, it is designed to prevent facial recognition software from detecting the person’s face. These designs have proven effective against Facebook’s face detection system.
Infrared LED glasses
According to experts in ethical hacking, Japan’s National Institute of Informatics developed glasses that include a set of near-infrared LED lights (invisible to the human eye) that project a layer of light to shield the wearer from detection by most surveillance and monitoring equipment. The glasses are designed to illuminate the area around the user’s eyes and nose, making it impossible for infrared-sensitive cameras to detect their face.
Reflectacles Ghost
Thanks to the initiative of Scott Urban, a small-business owner from the U.S., the idea of “Reflectacles”, a line of anti-facial recognition glasses, was born. The Reflectacles Ghost model, for example, reflects the invisible infrared light projected at the user’s face, so cameras that rely on this technology cannot record the wearer’s facial features.
IRpair goggles
Also part of the Reflectacles project, the IRpair anti-facial recognition goggles feature specially designed optical filters that block infrared radiation from reaching the user while allowing the natural light spectrum to pass freely. IRpair’s technology also defeats practices such as iris scanning and three-dimensional mapping of facial features with infrared light, making cameras perceive your face as empty space.
A pair of conventional sunglasses does not protect against facial recognition systems, since tinted lenses turn clear under infrared light. IRpair lenses, by contrast, darken in the presence of infrared light, completely blocking cameras that use this technology, ethical hacking experts mention.
Phantom goggles
Phantom goggles prevent facial recognition by using a material that reflects infrared light, stopping infrared security cameras from performing biometric analysis of your face. Like IRpair glasses, the Phantom frame reflects only the infrared spectrum, allowing light visible to the human eye to pass through.
IRclip
Entrepreneur Scott Urban also devised an alternative approach to the IRpair and Phantom glasses, which are based on blocking three-dimensional mapping and reflecting infrared light. The IRclip glasses let the user choose between two lenses, “IRdark” and “IRlight”, depending on the lighting conditions. Whether the wearer is in bright light or complete darkness, these goggles will prevent biometric sensing systems from working properly.
Hyperface clothing
Another option to prevent facial detection systems from infringing on our privacy is to overload them with patterns these systems detect as faces, hiding the wearer’s true face. This is the idea behind the Hyperface Project, whose creators have produced prints for clothes and accessories covered in “fake faces”. Wearing Hyperface clothing makes the work of detection systems harder, making monitoring and tracking more complex.
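The dilution effect behind this approach is easy to quantify: if a detector fires on every printed decoy as well as on the real face, a system that must then single out the true face is reduced to guessing among all candidates. The following toy sketch illustrates the idea; the `trial` helper and its numbers are illustrative assumptions, not anything from the Hyperface Project itself.

```python
import random

def pick_candidate(detections):
    """A naive tracker that must pick one face from all detections;
    with decoys present it can only guess."""
    return random.choice(detections)

def trial(num_decoys, trials=10_000):
    """Estimate how often the real face is singled out when the
    detector also fires on `num_decoys` printed fake faces."""
    detections = ["real"] + ["decoy"] * num_decoys
    hits = sum(pick_candidate(detections) == "real" for _ in range(trials))
    return hits / trials

print(f"no decoys: ~{trial(0):.2f}")  # the only detection is the real face
print(f"9 decoys:  ~{trial(9):.2f}")  # real face is now 1 candidate in 10
```

With no decoys the real face is found every time; with nine fake faces on a shirt, a guessing tracker succeeds only about one time in ten.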
Before posting on Facebook, apply FaceShield to your photos
Users may want to share their photos online, which is fine, but we can do so without exposing ourselves to facial recognition software. For this, ethical hacking experts recommend using a tool called FaceShield. The software works as a filter that the user applies to their photos before sharing them on a social media platform; its developers claim that by modifying minimal elements of the photograph, users’ faces become harder for machine learning tools on sites like Facebook to detect. These changes are virtually invisible to the human eye.
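The article does not describe FaceShield’s actual algorithm; tools in this space typically compute adversarial perturbations against a specific recognition model rather than random noise. The sketch below, with the hypothetical `subtle_perturbation` helper, only illustrates the underlying idea of a bounded, near-invisible pixel change applied before upload.

```python
import numpy as np

def subtle_perturbation(image, epsilon=4, seed=0):
    """Add a small, bounded pixel perturbation (at most `epsilon`
    intensity levels per channel): barely visible to a person, but
    it shifts the exact inputs a recognition model receives.
    `image` is a uint8 array of shape (H, W, 3)."""
    rng = np.random.default_rng(seed)
    noise = rng.integers(-epsilon, epsilon + 1, size=image.shape)
    # widen to int16 before adding so the sum cannot wrap around
    perturbed = np.clip(image.astype(np.int16) + noise, 0, 255)
    return perturbed.astype(np.uint8)

# toy "photo": a flat gray image standing in for a real picture
photo = np.full((64, 64, 3), 128, dtype=np.uint8)
shielded = subtle_perturbation(photo)
print(int(np.abs(shielded.astype(int) - photo.astype(int)).max()))  # never exceeds epsilon
```

A real adversarial filter would choose the noise direction using the gradients of a target face-recognition model, which is what lets such tiny changes defeat detection while random noise of the same size generally does not.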
Originally published at https://www.securitynewspaper.com on July 18, 2019.