The Battle for Biometric Privacy

In 2024, increased adoption of biometric surveillance systems, such as the use of AI-powered facial recognition in public places and access to government services, will spur biometric identity theft and anti-surveillance innovations. Individuals aiming to steal biometric identities to commit fraud or gain access to unauthorized data will be bolstered by generative AI tools and the abundance of face and voice data posted online.

Already, voice clones are being used for scams. Take, for example, Jennifer DeStefano, a mom in Arizona who, after answering a call from an unknown number, heard the panicked voice of her daughter crying “Mom, these bad men have me!” The scammer demanded money. DeStefano was eventually able to confirm that her daughter was safe. This hoax is a precursor for more sophisticated biometric scams that will target our deepest fears by using the images and sounds of our loved ones to coerce us to do the bidding of whoever deploys these tools.

In 2024, some governments will likely adopt biometric mimicry to support psychological torture. In the past, a person of interest might be told false information with little evidence to support the claims other than the words of the interrogator. Today, a person being questioned may have been arrested due to a false facial recognition match. Dark-skinned men in the United States, including Robert Williams, Michael Oliver, Nijeer Parks, and Randal Reid, have been wrongfully arrested due to facial misidentification, detained, and imprisoned for crimes they did not commit. They are among the groups, including the elderly, people of color, and gender-nonconforming individuals, who are at higher risk of facial misidentification.

Generative AI tools also give intelligence agencies the ability to create false evidence, like a video of an alleged coconspirator confessing to a crime. Perhaps just as harrowing is that the power to create digital doppelgängers will not be limited to entities with large budgets. The availability of open source generative AI systems that can produce humanlike voices and false videos will increase the circulation of revenge porn, child sexual abuse materials, and more on the dark web.

By 2024, we will have growing numbers of “excoded” communities and people—those whose life opportunities have been negatively altered by AI systems. At the Algorithmic Justice League, we have received hundreds of reports of biometric rights being compromised. In response, we will witness the rise of the faceless—those who are committed to keeping their biometric identities hidden in plain sight.

Because biometric rights will vary across the world, fashion choices will reflect regional biometric regimes. Face coverings, like those used for religious purposes or medical masks to stave off viruses, will be adopted as both fashion statement and anti-surveillance garments where permitted. In 2019, when protesters began destroying surveillance equipment while obscuring their appearance, a Hong Kong government leader banned face masks.

In 2024, we will start to see a bifurcation between mass surveillance territories and free-face territories—areas with laws like the provision in the proposed EU AI Act that bans the use of live biometrics in public places. Even in such places, anti-surveillance fashion will flourish; after all, a ban on live biometrics does not stop facial recognition from being applied retroactively to video feeds. Parents will fight to protect the right of children to be “biometric naive”—that is, to have none of their biometrics, such as faceprints, voiceprints, or iris patterns, scanned and stored by government agencies, schools, or religious institutions. New eyewear companies will offer lenses that make it harder for cameras to capture your ocular biometric information, and pairs of glasses will come with prosthetic extensions to alter your nose and cheek shapes. 3D printing tools will be used to make at-home face prosthetics, though depending on where you are in the world, the practice may be outlawed. In a world where the face is the final frontier of privacy, glancing upon the unaltered visage of another will be a rare intimacy.
