Technology

Campaign launched to ban “Gender recognition tech” due to its implications for trans individuals

Written by Ahsan Zafeer · 1 min read

Dangers posed by facial recognition like mass surveillance and mistaken identity have been widely discussed in recent years. But digital rights groups say an equally insidious use case is currently sneaking under the radar: using the same technology to predict someone’s gender. Now, a new campaign has been launched to ban technologies that conduct gender recognition. More than 60 NGOs have sent a letter to the European Commission, asking it to ban this technology.

“Trying to predict someone’s gender from digitized clues is fundamentally flawed,” says Os Keyes, a researcher who has written extensively on the topic. This technology tends to reduce gender to a simplistic binary and, as a result, is often harmful to trans people who may not fit into these narrow categories.

When the resulting systems are used for things like limiting entry to physical spaces or verifying someone’s identity for an online service, they lead to discrimination.

Commercial facial recognition systems, including those sold by big tech companies like Amazon and Microsoft, frequently offer gender classification as a standard feature.

With facial recognition tech, if someone has short hair, they’re categorized as male; if they’re wearing makeup, they’re female. Similar assumptions are made based on biometric data like bone structure and face shape. The result is that people who don’t fit easily into these two categories — like many trans individuals — face discrimination.
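The brittle logic described above can be illustrated with a short sketch. This is purely hypothetical code, not any vendor’s actual model; the feature names and thresholds are invented solely to show why a system with only two possible outputs must misclassify anyone who doesn’t match its assumptions.

```python
# Illustrative sketch of heuristic binary gender classification.
# NOT a real system: feature names and thresholds are invented for demonstration.

def classify_gender(features: dict) -> str:
    """Toy classifier that forces every input into 'male' or 'female'."""
    score = 0
    if features.get("hair_length_cm", 0) < 10:      # short hair -> "male" signal
        score -= 1
    if features.get("wearing_makeup", False):       # makeup -> "female" signal
        score += 1
    if features.get("jaw_width_ratio", 1.0) > 1.1:  # broad jaw -> "male" signal
        score -= 1
    # Note there is no "unknown" or "neither" output: the design itself
    # guarantees that everyone is shoehorned into one of two labels.
    return "female" if score > 0 else "male"

# A person whose presentation mixes these signals (short hair AND makeup)
# is still forced into a bin, which is the failure mode the article describes.
print(classify_gender({"hair_length_cm": 5, "wearing_makeup": True}))  # -> male
```

The point of the sketch is structural: no matter how the thresholds are tuned, a function whose return type is one of two strings cannot represent people outside that binary.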

“These systems don’t just fail to recognize that trans people exist. They literally can’t recognize that trans people exist,” says Keyes.

Current applications of this gender recognition tech include digital billboards that analyze passersby to serve them targeted advertisements; digital spaces like the “girls-only” social app Giggle, which admits people by guessing their gender from selfies; and marketing stunts, like a Berlin campaign for Equal Pay Day that tried to identify women by facial scan in order to offer them discounted subway tickets. Researchers have also discussed more troubling potential use cases, like deploying the technology to limit entry to gendered areas such as bathrooms and locker rooms.

Being rejected by a machine in such a scenario has the potential to be not only humiliating and inconvenient but also to trigger an even more severe reaction from others present.

Ultimately, technology that tries to reduce the world to binary classifications based on simple heuristics will always fail when faced with the variety and complexity of human expression, inevitably hurting those who are already marginalized by society.

Source: The Verge

Written by Ahsan Zafeer
Ahsan Zafeer is a digital marketing professional specializing in content-based functional areas, driven by a passion for developing, nurturing, and strategizing key content aspects. He writes extensively on tech, digital marketing, SEO, cybersecurity, and emerging technologies, and serves as a digital marketing strategist and freelance consultant for globally oriented organizations. He tweets @AhsanZafeer.