Machine Learning Roulette


Selfies are being ripped apart by an AI-driven web experiment that uses a huge image database to classify pictures of people.


From 'timid defenceless simpleton' to 'insignificant student', the online project ImageNet Roulette has handed out brutal assessments to an increasingly long list of users keen to experiment.

The web page launched as part of Training Humans, a photography exhibition conceived by Professor Kate Crawford and artist Trevor Paglen.

Generated by ImageNet Roulette, a project by researcher Kate Crawford and artist Trevor Paglen, the labels draw on AI technology to replicate how machine learning systems analyse images. Using an open-source deep-learning framework, ImageNet Roulette matched uploaded images against categories that already exist in a training set called ImageNet.
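The matching process described above can be illustrated with a deliberately simplified sketch: an image is reduced to a feature vector, which is then compared against a "prototype" vector for each category, and the closest category wins. The category names, the three-dimensional vectors, and the `classify` helper below are all hypothetical simplifications for illustration; the real project used a deep neural network trained on ImageNet's "person" categories, not nearest-prototype matching over hand-made vectors.

```python
import math

# Hypothetical per-category prototype feature vectors (illustrative only;
# real systems use learned features with hundreds or thousands of dimensions).
PROTOTYPES = {
    "enchantress": [0.9, 0.1, 0.3],
    "student":     [0.2, 0.8, 0.5],
    "offender":    [0.1, 0.3, 0.9],
}

def cosine(a, b):
    # Cosine similarity: dot product of the vectors divided by the
    # product of their magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def classify(features):
    # Return the category whose prototype is most similar to the input.
    return max(PROTOTYPES, key=lambda label: cosine(features, PROTOTYPES[label]))

print(classify([0.85, 0.15, 0.25]))  # prints "enchantress"
```

The sketch also hints at why such systems inherit bias: whatever labels exist in the training set are the only labels the classifier can ever assign, however dubious they are.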

Ever wonder how algorithms trained on human classification categories type you? Thanks to this new tool from @katecrawford and @trevorpaglen’s “Training Humans” project now you can: https://t.co/ESrpzyjtxU

— J.D. Schnepf (@jd_schnepf) September 15, 2019

weird flex but ok #imagenet pic.twitter.com/0EWCoZzmhz

— Chid Gilovitz (@chidakash) September 16, 2019

The gallery contains several collections of pictures used by scientists to train AI in how to 'see and categorise the world', and ImageNet Roulette is based on this research.

Or, rather, the dataset ImageNet Roulette draws from, ImageNet, is filled with problematic categories that reflect the bias often inherent in the large datasets that make machine learning possible.

The tech has been trained using the existing ImageNet database and is designed to be a 'peek into the politics of classifying humans in machine learning systems and the data they are trained on'.


It has since gone viral on social media, with huge numbers of users ignoring a warning that the AI 'regularly classifies people in dubious and cruel ways'.

While some have been left flattered by being assigned descriptors like 'enchantress', others have been told they fall into categories like 'offender' and 'rape suspect'.



I am flattered by ImageNet's classification of me pic.twitter.com/6yHE3vESyZ

— sᴛᴇʟʟᴀ (@computerpupper) September 16, 2019

📯 mortal soul, available for recitals.
(via ImageNet https://t.co/6wDgGC9cXH) pic.twitter.com/jWwIRtqyeu

— Craig (@craig88) September 16, 2019

In a bid to explain why people might receive unflattering designations, a post on the site says they are all based on existing data already assigned to pictures in the ImageNet database.

The original database was developed in 2009 by scientists at Princeton and Stanford universities in the US, and now contains more than 20,000 categories spanning millions of images.

ImageNet Roulette is 'meant in part to demonstrate how various kinds of politics propagate through technical systems, often without the creators of those systems even being aware of them'.

Hmmm. Not sure what I make of this ImageNet algorithm... pic.twitter.com/PTCVevgfCJ


— Thomas Maidment (@maidment_thomas) September 16, 2019

The page also states that it 'does not store the photos people upload or any other data' - reassuring those who may have been put off by privacy concerns surrounding other recent picture-driven internet phenomena.

Earlier this year, hundreds of thousands of people began to share their photos from FaceApp, which alters selfies to make them look older, younger, or to change their gender or hair style.

Some users expressed fears over its terms and conditions allowing the app to collect data from phones, and a claim that its parent company was based in Russia and had received funds from the Russian government.
