
This EU-funded AI judges your face and tells you how “normal” you are

Written by Hamza Zakir · 1 min read

Getting judged by society is bad enough, so you can imagine what it feels like to have an AI system expertly analyze your face and give it a rating for attractiveness. Don’t worry though; unlike the anxiety-inducing remarks you might get from people, this AI is actually judging you for a greater social cause.

Facial recognition is rife with errors and biases, from its well-documented tendency to perform worse on darker skin tones to its failure to treat every face equally. And of course, there are the privacy concerns. Against this backdrop, a new website called How Normal Am I? uses algorithms to judge users’ age, attractiveness, BMI, life expectancy, and gender.

The website was created by SHERPA, an EU-funded project that explores the impact of AI on ethics and human rights. In an interview with The Next Web, Tijmen Schep, artist-in-residence at SHERPA, explains the system and demonstrates it in detail.

The first thing the system does is ask you to face the webcam so that its algorithm can analyze your face and rate it for attractiveness. Schep explains that similar algorithms are used in dating apps like Tinder to match equally attractive people and in social media platforms like TikTok to promote content made by “good-looking” people.

The point Schep makes with his demonstration is that facial analysis algorithms are only as good as the data they are trained on. Their training data consists of thousands of photos manually labeled by a group of people, and because perceptions of beauty vary across the world, any such algorithm will classify someone as beautiful or ugly according to how its particular labelers rated the training samples.
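To make that dependence concrete, here is a minimal, hypothetical sketch in Python with scikit-learn (not SHERPA’s actual code): a toy “attractiveness” regressor fit to synthetic face embeddings and scores from imaginary labelers who share one narrow preference. The model ends up reproducing that preference almost perfectly, which is the whole point.

```python
# Minimal sketch (not SHERPA's code): a toy "attractiveness" regressor trained on
# human-assigned labels, to illustrate how the model can only echo its labelers.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in for face embeddings (e.g. 128-d vectors from a face-recognition network).
n_faces, dim = 2000, 128
embeddings = rng.normal(size=(n_faces, dim))

# Hypothetical labelers who all share the same narrow preference: their scores are
# a fixed linear function of a few embedding dimensions, plus noise.
labeler_preference = np.zeros(dim)
labeler_preference[:5] = [2.0, -1.5, 1.0, 0.5, -0.5]
scores = embeddings @ labeler_preference + rng.normal(scale=0.3, size=n_faces)

X_train, X_test, y_train, y_test = train_test_split(embeddings, scores, random_state=0)

model = Ridge(alpha=1.0).fit(X_train, y_train)

# The model reproduces the labelers' preference almost exactly: a face those
# particular labelers would rate low gets a low score, regardless of any other
# notion of beauty.
print("R^2 against the labelers' own scores:", round(model.score(X_test, y_test), 3))
```

Swap in labelers with a different taste and the same face gets a different score; nothing about the algorithm changes, only the labels it was fit to.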

“If you have a low score, it might just be because the judgment of these algorithms is so dependent on how they were trained,” explained Schep. “Of course, if you got a really high score, that’s just because you are incredibly beautiful.”

Another noteworthy aspect of facial analysis systems is the ease with which they can be manipulated and deceived. For instance, when Schep’s system began estimating its subject’s age, the subject was able to fool it into identifying him as ten years younger than he actually was simply by shaking his head.

Schep believes that facial recognition technology has given us that odd feeling of being “watched” all the time, especially as it becomes a bigger part of our lives. He hopes to use his system to raise awareness of the long-term risks associated with such technology, chief among them the loss of our right to privacy.

“You might feel more pressure to behave ‘normally’, which for an algorithm means being more average. That’s why we have to protect our human right to privacy, which is essentially our right to be different. You could say that privacy is a right to be imperfect,” Schep said.
