Human Generated Data

Title

Untitled (several couples taking dance lesson)

Date

1949

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15169

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.3
Person 99.3
Person 99.2
Person 99.1
Person 99
Clothing 96.1
Apparel 96.1
Person 93.2
Dance Pose 92.8
Leisure Activities 92.8
Person 85.5
Female 83.1
Dress 77.4
People 76.5
Art 72.4
Drawing 72.4
Face 72.1
Evening Dress 68.8
Robe 68.8
Fashion 68.8
Gown 68.8
Portrait 67
Photo 67
Photography 67
Woman 66.2
Sketch 63.1
Girl 60.8
Overcoat 56.6
Suit 56.6
Coat 56.6
Dance 56
Performer 55.1
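The machine-generated tags above are flat "label score" pairs, one per line, with the score a percent-style confidence. A minimal sketch of parsing such a listing into structured records (the format is assumed from the plain-text dump shown here, not from any official API response shape):

```python
def parse_tags(text):
    """Parse 'label score' lines into (label, score) tuples.

    Labels may contain spaces ('Dance Pose 92.8'), so the score is
    taken as everything after the last space on each line.
    """
    tags = []
    for line in text.strip().splitlines():
        line = line.strip()
        if not line:
            continue
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

sample = """Human 99.3
Dance Pose 92.8
Person 99"""
print(parse_tags(sample))  # → [('Human', 99.3), ('Dance Pose', 92.8), ('Person', 99.0)]
```

Sorting the result by score descending reproduces the ordering used in the listings above.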

Imagga
created on 2022-03-05

people 22.9
male 22.7
person 20.9
man 20.8
adult 20.3
dress 19
wind instrument 17.5
couple 17.4
fashion 15.8
bride 15.8
horn 15.7
happy 15.7
women 15
happiness 14.9
pretty 14
musical instrument 13.8
wedding 13.8
smile 13.5
groom 13.4
attractive 13.3
model 13.2
men 12.9
professional 12.7
two 12.7
together 12.3
device 12.2
flute 12.1
clothing 12
love 11.8
silhouette 11.6
curtain 11.2
shower curtain 11.1
cornet 11.1
portrait 11
smiling 10.8
brass 10.7
fun 10.5
party 10.3
teacher 10.1
lady 9.7
black 9.6
standing 9.6
hair 9.5
woodwind 9.2
hand 9.1
outdoors 9
cheerful 8.9
alarm 8.9
sexy 8.8
businessman 8.8
celebration 8.8
lifestyle 8.7
married 8.6
corporate 8.6
wife 8.5
bouquet 8.5
active 8.1
husband 8.1
gown 8.1
looking 8
outfit 7.9
flowers 7.8
hands 7.8
play 7.8
youth 7.7
marriage 7.6
power 7.6
clothes 7.5
sport 7.4
business 7.3
pose 7.2
body 7.2
team 7.2
building 7.1
blind 7.1
family 7.1
posing 7.1
face 7.1
furnishing 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

wall 96.9
dance 96.1
text 94.7
clothing 84
person 82.3
dress 63.1
woman 59.8
black and white 53.6
dressed 29.8
clothes 18.3

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 97.8%
Surprised 88.8%
Calm 8.8%
Fear 0.8%
Disgusted 0.7%
Angry 0.3%
Confused 0.2%
Sad 0.1%
Happy 0.1%

AWS Rekognition

Age 34-42
Gender Male, 93.8%
Calm 95.9%
Happy 1.3%
Surprised 0.8%
Sad 0.6%
Angry 0.5%
Confused 0.3%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 45-51
Gender Female, 99.7%
Calm 58%
Happy 34.2%
Sad 2.4%
Disgusted 2.4%
Angry 1%
Fear 0.8%
Surprised 0.7%
Confused 0.5%

AWS Rekognition

Age 41-49
Gender Male, 99.3%
Calm 85.9%
Fear 5.2%
Sad 4.2%
Happy 1.8%
Confused 1.4%
Surprised 0.7%
Disgusted 0.4%
Angry 0.4%

AWS Rekognition

Age 27-37
Gender Male, 88.1%
Sad 62.3%
Calm 25.1%
Fear 5.3%
Confused 3.9%
Disgusted 1.2%
Angry 1.1%
Surprised 0.7%
Happy 0.5%

AWS Rekognition

Age 23-31
Gender Female, 82%
Calm 94.7%
Sad 3.2%
Angry 0.5%
Surprised 0.5%
Happy 0.4%
Disgusted 0.4%
Confused 0.3%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people standing in front of a mirror posing for the camera 50.2%
a person standing in front of a mirror posing for the camera 50.1%
a group of people in front of a mirror posing for the camera 50%