Selfies are being ripped apart by an AI-driven web experiment that uses a huge image database to classify pictures of people.
From “timid defenceless simpleton” to “insignificant student”, the online project ImageNet Roulette has handed out brutal assessments to an increasingly long list of users keen to experiment.
The web page launched as part of Training Humans, a photography exhibition conceived by Professor Kate Crawford and artist Trevor Paglen.
The gallery contains several collections of pictures used by scientists to train AI in how to “see and categorise the world”, and ImageNet Roulette is based on this research.
The tech has been trained using the existing ImageNet database and is designed to be a “peek into the politics of classifying humans in machine learning systems and the data they are trained on”.
It has since gone viral on social media, with huge numbers of users ignoring a warning that the AI “regularly classifies people in dubious and cruel ways”.
While some have been left flattered by being assigned descriptors like “enchantress”, others have been told they fall into categories like “offender” and “rape suspect”.
In a bid to explain why people might receive unflattering designations, a post on the site says they are all based on existing data already assigned to pictures in the ImageNet database.
The original database was developed in 2009 by scientists at Princeton and Stanford universities in the US, and has since grown to label millions of images across more than 20,000 categories.
ImageNet Roulette is “meant in part to demonstrate how various kinds of politics propagate through technical systems, often without the creators of those systems even being aware of them”.
The page also states that it “does not store the photos people upload or any other data” – reassuring those who may have been put off by privacy concerns surrounding other recent picture-driven internet phenomena.
Earlier this year, hundreds of thousands of people began to share their photos from FaceApp, which alters selfies to make users look older or younger, or to change their gender or hairstyle.
Some users expressed fears over its terms and conditions, which allowed the app to collect data from their phones, and over claims that its parent company was based in Russia and had received funding from the Russian government.