Human Generated Data

Title

Untitled (Rainbow Cafe Teen Canteen, Chamberlain, South Dakota)

Date

1957, printed later

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1012

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Dance Pose 100
Leisure Activities 100
Person 99.6
Human 99.6
Person 99.2
Person 98.9
Person 98.4
Person 98.3
Person 98.2
Shoe 96.6
Clothing 96.6
Footwear 96.6
Apparel 96.6
Dance 96.5
Person 96.2
Person 94.6
Shoe 91.3
Tango 90.7
Shoe 88.6
Shoe 86.9
Shoe 72.9
Performer 58.8
Flamenco 58
Person 54.9
Person 43.9

Clarifai
created on 2023-10-26

people 99.9
dancing 99.5
dancer 99.4
man 99
woman 98
music 97.8
costume 97.3
monochrome 96.3
group 96.2
group together 96.1
dress 95.5
child 95.4
wear 92.6
performance 92.6
girl 92.4
adult 92.3
street 92.2
ballet 91.9
couple 91.2
theater 88.1

Imagga
created on 2022-01-23

sword 64.2
weapon 52.8
dancer 35.2
people 33.5
dance 31.3
group 28.2
performer 27.5
man 26.2
person 24.4
adult 23.4
men 23.2
entertainer 21
art 20.3
silhouette 19.9
male 19.1
team 17
black 16.1
business 15.2
women 14.2
creation 13.9
walking 13.3
businessman 13.2
fashion 12.1
happy 11.9
active 11.9
sport 11.8
dress 11.7
suit 11.7
competition 11
crowd 10.6
friends 10.3
teamwork 10.2
holiday 10
girls 10
attractive 9.8
fun 9.7
outdoors 9.7
standing 9.6
happiness 9.4
city 9.1
success 8.8
together 8.8
urban 8.7
outfit 8.7
dancing 8.7
work 8.6
two 8.5
joy 8.3
training 8.3
leisure 8.3
human 8.2
style 8.2
pose 8.1
model 7.8
winter 7.7
life 7.6
beach 7.6
clothing 7.6
meeting 7.5
movement 7.5
friendship 7.5
musical instrument 7.3
lifestyle 7.2
body 7.2
activity 7.2
portrait 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

dance 99.8
person 98
text 96.4
clothing 88.2
dress 79.5
footwear 78.4
dancing 69.8
old 63.6
woman 63.1

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 30-40
Gender Female, 99.3%
Confused 38.7%
Disgusted 35.1%
Happy 14.8%
Angry 6.9%
Calm 1.6%
Surprised 1.3%
Sad 1%
Fear 0.5%

AWS Rekognition

Age 19-27
Gender Male, 74%
Surprised 33.7%
Confused 25.6%
Calm 12.1%
Angry 8.6%
Fear 8.1%
Disgusted 7%
Sad 2.6%
Happy 2.4%

AWS Rekognition

Age 40-48
Gender Male, 97.8%
Sad 45%
Angry 29.3%
Fear 9.1%
Disgusted 8.5%
Confused 5.3%
Surprised 1.2%
Calm 1%
Happy 0.7%

AWS Rekognition

Age 18-24
Gender Female, 100%
Sad 65.5%
Calm 31.4%
Angry 0.8%
Confused 0.6%
Disgusted 0.5%
Fear 0.4%
Happy 0.4%
Surprised 0.4%

AWS Rekognition

Age 21-29
Gender Male, 93.4%
Calm 93.9%
Angry 1.7%
Sad 1.4%
Surprised 1.1%
Disgusted 0.7%
Happy 0.5%
Fear 0.4%
Confused 0.3%

AWS Rekognition

Age 16-22
Gender Male, 99.6%
Calm 93.7%
Sad 5.5%
Confused 0.2%
Angry 0.2%
Disgusted 0.1%
Happy 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 18-26
Gender Female, 90.3%
Fear 95%
Happy 1.7%
Sad 0.9%
Calm 0.7%
Surprised 0.7%
Angry 0.4%
Confused 0.3%
Disgusted 0.3%

AWS Rekognition

Age 21-29
Gender Female, 92.8%
Sad 95.1%
Fear 1.4%
Calm 0.8%
Angry 0.8%
Confused 0.8%
Disgusted 0.7%
Happy 0.3%
Surprised 0.2%

AWS Rekognition

Age 21-29
Gender Male, 80.1%
Calm 78.1%
Fear 12.8%
Sad 6.5%
Angry 1.4%
Confused 0.6%
Happy 0.3%
Surprised 0.2%
Disgusted 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 96.6%

Text analysis

Amazon

I
I I
-
0 -
a
- 77712
0
=
- =
CRAFT
77712