A new, female-only social media platform, Giggle, is being marketed as a safe space for women. The app verifies users as female with a 3D scan of their faces. But there's one problem: it doesn't work if you're trans. Facial recognition technology has made the news before for failing to correctly identify gender-nonconforming and non-white faces.

Facial recognition AI is trained on catalogs of faces, learning to measure and categorize facial characteristics. Yet most of these programs are trained primarily on cisgender, white faces.
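To make the structural problem concrete, here is a minimal sketch (with hypothetical toy data, not any real face dataset or Giggle's actual system): a classifier trained only on binary "man"/"woman" labels can never output anything else, no matter whose face it sees.

```python
def train_centroids(samples):
    """Average the feature vectors for each label (a nearest-centroid model)."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [x / counts[lbl] for x in vec] for lbl, vec in sums.items()}

def classify(centroids, features):
    """Return the label whose centroid is closest to the input features."""
    def dist(center):
        return sum((a - b) ** 2 for a, b in zip(center, features))
    return min(centroids, key=lambda lbl: dist(centroids[lbl]))

# Toy 2-D "face measurements" labeled with only two categories.
training = [
    ([0.2, 0.1], "man"), ([0.3, 0.2], "man"),
    ([0.8, 0.9], "woman"), ([0.7, 0.8], "woman"),
]
model = train_centroids(training)

# Any input is forced into one of the two trained boxes; a third
# category simply does not exist in the model's output space.
print(classify(model, [0.4, 0.4]))
print(sorted(model))  # the only labels the model can ever produce
```

Real systems use deep networks rather than centroids, but the limitation illustrated here is the same: if the training labels are binary, the output is binary, so nonbinary people are misclassified 100 percent of the time by construction.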

In October 2019, a study from CU Boulder documented bias in facial recognition software. The researchers found that the technology accurately classifies cisgender men 97 percent of the time and cisgender women 98 percent of the time, yet it misgendered trans men as women 38 percent of the time and misclassified nonbinary people 100 percent of the time, because the software only recognizes binary gender categories. Similarly, these programs misidentify black people 10 times more frequently than white people.

Beyond being a bummer for trans girls who'd like to join Giggle, this bias can lead to setbacks for the LGBTQ community in politics and legislation. Facial recognition is actively used by Facebook, Apple, and Amazon, as well as by law enforcement. Yet the technology is not currently regulated by state or federal government, meaning companies and law enforcement agencies can use it for surveillance unchecked.

At the very least, we have Giggle to thank for bringing these issues to the LGBTQ community's attention. In the months and years to come, it will be up to courts and lawmakers to strike a balance between data collection and privacy. For the time being, social media platforms like Giggle need to reassess this exclusionary and problematic technology.