How Surveillance Tech Aids Enforcement of Anti-Trans Laws
From the grocery store to a sporting event, the airport to the shopping mall, facial recognition software has become increasingly common among a wide variety of vendors and entities in public spaces. The automated gender recognition technology baked into many facial recognition programs could be used to enforce anti-trans laws, according to Keyes.
“The main scenario … that you can imagine is cameras — either pre-existing ones or ones that are newly placed on the outside of gendered bathrooms — being loaded with software that has them flag anyone who goes into those bathrooms who appears to be incongruent with the expectation. So someone who the system classifies as a man goes into a women’s bathroom,” they said.
Keyes pointed to a report by the National Institute of Standards and Technology, a federal agency that oversees government technological development and innovation, as evidence of this possibility. The report includes a section titled “Gender Verification” that details how this technology could be used: “Gender-targeted surveillance,” the report notes, “can assist with monitoring” areas including the “entrance to female restrooms or locker rooms.” The technology has already seen blanket applications, from gender-specific marketing at a London bus stop to automatically giving women subway discounts in Berlin on Equal Pay Day.
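To make the scenario concrete, here is a minimal, hypothetical sketch of the kind of mismatch check Keyes and the NIST report describe. The classifier output, labels, and camera identifiers below are illustrative assumptions, not any real vendor’s system.

```python
# Hypothetical sketch: a camera at a gendered bathroom entrance runs an
# automated gender recognition model on each detected face and raises an
# alert when the predicted label does not match the room's label.
# The model output and camera IDs here are stand-ins for illustration only.

from dataclasses import dataclass


@dataclass
class Detection:
    face_id: str             # identifier assigned by the camera system
    predicted_gender: str    # output of a gender classification model
    confidence: float        # model confidence for that prediction


def flag_mismatches(detections, room_label, threshold=0.8):
    """Return detections whose predicted gender conflicts with the room's label."""
    return [
        d for d in detections
        if d.confidence >= threshold and d.predicted_gender != room_label
    ]


# Example: a camera outside a women's restroom
observed = [
    Detection("cam3-0012", predicted_gender="female", confidence=0.91),
    Detection("cam3-0013", predicted_gender="male", confidence=0.88),
]
for alert in flag_mismatches(observed, room_label="female"):
    print(f"ALERT: {alert.face_id} classified as {alert.predicted_gender}")
```

In this sketch, anyone the classifier mislabels would be flagged at the door, which is precisely the failure mode the article’s sources are concerned about.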
Jake Laperruque, deputy director of the Security and Surveillance Project at the Center for Democracy and Technology and former POGO senior policy counsel, thinks blanket uses of facial recognition, such as scanning the face of every person entering a bathroom, are less likely than particularized use by law enforcement in investigations, such as using facial recognition to identify a specific performer in a drag show based on photos or videos. He said this is especially the case for performers who may seek to avoid identification by using a pseudonym — facial recognition technology makes it that much harder to avoid being surveilled. “If you’re actually trying to take defensive measures against surveillance, you can leave your phone at home. If you’re going to a place where you’re worried about surveillance, you cannot leave your face at home,” Laperruque said.
As with other forms of surveillance, the harms of video surveillance tend to be more acutely felt by those with marginalized status. Communities of color have long been exposed to heightened police surveillance, from FBI tracking of members of the civil rights movement to CIA spying on Muslim communities in the wake of 9/11. More recently, public housing residents have experienced heightened video surveillance, with footage used to punish and sometimes evict residents — 45% of public housing households are Black, according to a 2012 report.
Surveillance technology can also have disproportionate impacts on communities and people of color on city streets: A ProPublica analysis found that people who lived in Chicago zip codes with majority Black and Latino residents were twice as likely to receive traffic camera tickets as those who lived in primarily white zip codes.
These impacts can be compounded for transgender people of color, and laws targeting trans people in public spaces have clear racial dimensions, according to Scott from the Lavender Rights Project. “I think about drag as a Black queer and trans art,” she said, explaining drag’s origins in ballroom culture. “Drag is really rooted in Blackness, Black transness, and Black queerness. … It is one of the few places, socially, that it has been acceptable for us to show up as we are, to show our art and creativity and have it appreciated and not be harassed, targeted, and harmed.”
Scott said bathroom bills will also disproportionately impact Black transgender women by ramping up criminalization. Binsfeld echoed this concern. “Trans visibility has always been a double-edged sword, because for many trans people visibility hasn’t meant safety, especially folks who are multiply marginalized,” they said.
Increased federal regulation of facial recognition would be a meaningful way to blunt some of the potential harms of surveillance technology being used to enforce anti-trans laws, Keyes and Laperruque agree. There are currently no federal regulations on law enforcement use of facial recognition technology, even as it has become widely used by police and federal agencies. “Facial recognition — we have some state laws and a few city bans — but for the most part it’s the Wild West in terms of how it’s used,” Laperruque said. “I’d be very worried that these types of tools do not have the guardrails they need on them, and that can lead to abuse, that can lead to large-scale monitoring. And when we’re talking about these types of laws that are designed and targeted at vulnerable communities, that creates a lot of risk.”
He pointed to the Facial Recognition Act of 2022, a bill that would have limited police use of facial recognition, as a measure that could have counteracted some potential enforcement harms of anti-trans laws linked to video surveillance. The bill would have permitted law enforcement to use facial recognition technology only in investigations of serious violent felonies; most of the anti-trans laws that carry criminal penalties are designated as lower-degree felonies or misdemeanors. The measure has not been reintroduced in the current session.