Human Generated Data

Title

Untitled (young couple dance, she dips)

Date

1952, printed later

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.128

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Dance Pose 100
Leisure Activities 100
Human 99.7
Tango 99.7
Dance 99.7
Person 99.5
Person 99
Apparel 98.6
Footwear 98.6
Shoe 98.6
Clothing 98.6
Person 98.4
Person 97
Shoe 96.2
Person 92.1
Person 86.1
Person 85.7
Person 69.3
Person 66.2
Shoe 57.9
Shoe 56.8
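
Each line in the Amazon list above pairs a label with a confidence score on a 0–100 scale, and the same label ("Person", "Shoe") can appear once per detected instance. A minimal sketch (not the museum's own pipeline) of parsing such lines and keeping the highest confidence per label:

```python
# Sketch: parse "label confidence" lines like the Amazon tag list above
# and deduplicate repeated instance detections by keeping the best score.
raw = """Dance Pose 100
Leisure Activities 100
Human 99.7
Tango 99.7
Person 99.5
Person 99
Shoe 98.6
Shoe 96.2
Shoe 57.9"""

best = {}
for line in raw.splitlines():
    name, score = line.rsplit(" ", 1)  # label itself may contain spaces
    best[name] = max(best.get(name, 0.0), float(score))

print(best["Person"])  # highest of the repeated Person detections
print(sorted(best))
```

The repeated "Person" and "Shoe" entries collapse to one score each, which is why the Feature analysis section below reports a single value per label.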

Imagga
created on 2021-12-14

groom 49.9
dress 31.6
person 27.4
people 27.3
bride 22.6
adult 22.6
man 20.8
couple 20
fashion 19.6
male 18.4
wedding 18.4
happy 18.2
dancer 17.4
pretty 16.8
attractive 16.1
portrait 15.5
performer 14.3
love 14.2
clothing 14.2
happiness 14.1
lady 13.8
model 13.2
black 13.1
women 12.6
style 12.6
marriage 12.3
dance 12.3
suit 12
elegance 11.7
new 11.3
party 11.2
lifestyle 10.8
smile 10.7
posing 10.7
bouquet 10.6
fun 10.5
sexy 10.4
art 10.4
celebration 10.4
business 10.3
professional 10
face 9.9
human 9.7
bridal 9.7
together 9.6
looking 9.6
married 9.6
hair 9.5
color 9.5
men 9.4
teacher 9.4
entertainer 9.2
romantic 8.9
businessman 8.8
corporate 8.6
elegant 8.6
two 8.5
gown 8.5
old 8.4
joy 8.3
traditional 8.3
pose 8.2
cheerful 8.1
family 8
smiling 8
brunette 7.8
ceremony 7.8
educator 7.7
culture 7.7
hand 7.6
holding 7.4
church 7.4
costume 7.4
jacket 7.4
group 7.3
cute 7.2
building 7.1
interior 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

person 99
text 98.7
dance 96.4
posing 96.3
clothing 95.3
standing 92.4
man 90.4
gallery 78
black 74.6
old 74.5
woman 67.8
group 66
white 64.4
footwear 56.9
dress 53.6
vintage 26

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 86.6%
Calm 79.1%
Surprised 8.6%
Angry 4.1%
Confused 3.1%
Sad 2.1%
Disgusted 1.1%
Happy 1.1%
Fear 0.8%

AWS Rekognition

Age 9-19
Gender Female, 90%
Happy 73.9%
Calm 21.5%
Angry 1.6%
Surprised 1.5%
Sad 0.7%
Disgusted 0.3%
Fear 0.3%
Confused 0.2%

AWS Rekognition

Age 29-45
Gender Male, 54.2%
Sad 42%
Calm 39.6%
Confused 12.7%
Surprised 2.3%
Angry 1%
Fear 0.9%
Happy 0.8%
Disgusted 0.8%

AWS Rekognition

Age 24-38
Gender Female, 51.4%
Calm 33.5%
Sad 18.6%
Surprised 16.6%
Fear 11.1%
Confused 7.6%
Angry 5.8%
Happy 4.7%
Disgusted 2%

AWS Rekognition

Age 32-48
Gender Male, 82.6%
Calm 90.3%
Sad 4.5%
Angry 1.8%
Surprised 1.5%
Disgusted 0.7%
Happy 0.5%
Confused 0.5%
Fear 0.2%

AWS Rekognition

Age 19-31
Gender Male, 85.1%
Sad 78.8%
Calm 19.5%
Happy 0.5%
Confused 0.4%
Angry 0.3%
Fear 0.3%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 21-33
Gender Female, 93.5%
Calm 85.7%
Sad 12.4%
Happy 0.6%
Angry 0.6%
Confused 0.3%
Fear 0.2%
Surprised 0.2%
Disgusted 0.1%

AWS Rekognition

Age 23-37
Gender Male, 53.4%
Calm 49.7%
Sad 33.2%
Happy 10.6%
Confused 3.4%
Angry 1.4%
Fear 0.7%
Disgusted 0.5%
Surprised 0.4%

AWS Rekognition

Age 28-44
Gender Male, 66.4%
Fear 50%
Angry 36.7%
Calm 5.6%
Surprised 2.6%
Sad 2.5%
Confused 1.1%
Disgusted 0.8%
Happy 0.7%
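
Each AWS Rekognition face block above gives an estimated age range, a gender guess with its confidence, and eight emotion scores that sum to roughly 100%. A small sketch (using values copied from the first face block) of picking the dominant emotion:

```python
# Sketch: emotion confidences from the first AWS Rekognition face
# block above; the dominant emotion is simply the highest-scoring one.
face = {
    "Calm": 79.1, "Surprised": 8.6, "Angry": 4.1, "Confused": 3.1,
    "Sad": 2.1, "Disgusted": 1.1, "Happy": 1.1, "Fear": 0.8,
}
dominant = max(face, key=face.get)
print(dominant, face[dominant])  # Calm 79.1
```

Note that some faces above have no clear winner (e.g. Sad 42% vs. Calm 39.6%), so the top score alone can be a weak signal.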

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
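
Unlike Rekognition's percentages, Google Vision reports face attributes as qualitative likelihood buckets. The Vision API orders these on an ordinal scale (Very unlikely → Very likely), so they can be compared with a simple index lookup, as in this sketch:

```python
# Sketch: Google Vision likelihood buckets form an ordinal scale, so
# "is this at least Possible?" reduces to comparing list positions.
LIKELIHOOD = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]

def at_least(value, threshold):
    """True if `value` sits at or above `threshold` on the likelihood scale."""
    return LIKELIHOOD.index(value) >= LIKELIHOOD.index(threshold)

# The fourth face above is the only one with Joy rated "Possible":
print(at_least("Possible", "Unlikely"))       # True
print(at_least("Very unlikely", "Possible"))  # False
```

This is why, across the eleven faces listed, only the fourth registers any Joy and the fifth any Headwear; everything else sits at the bottom of the scale.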

Feature analysis

Amazon

Person 99.5%
Shoe 98.6%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 96.3%
a vintage photo of a group of people posing for a picture 96.2%
a group of people posing for a photo 96.1%