Human Generated Data

Title

Untitled (performers wearing women's clothes)

Date

1946

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19417

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.2
Person 99.2
Person 98.5
Person 98.1
Person 96.8
Person 95.3
Apparel 94.2
Clothing 94.2
Person 90.6
Gown 82.9
Evening Dress 82.9
Fashion 82.9
Robe 82.9
Hair 69.6
Costume 67.9
People 65.8
Person 64.4
Female 64.3
Crowd 58.5

Imagga
created on 2022-03-05

people 24
person 23.3
wind instrument 22.2
musical instrument 22
brass 20.9
fashion 19.6
man 18.1
adult 17.5
cap 17.1
clothing 16.9
shower cap 16
attractive 14.7
portrait 14.2
stage 13.5
black 13.4
headdress 12.8
face 12.8
style 12.6
model 12.4
lady 12.2
hair 11.9
dress 11.7
couple 11.3
happy 11.3
old 11.1
trombone 11.1
women 11.1
costume 11
happiness 11
performer 10.7
sexy 10.4
outfit 10.2
elegance 10.1
art 10
male 9.9
pretty 9.8
human 9.7
body 9.6
celebration 9.6
standing 9.5
party 9.4
dark 9.2
dancer 9.2
kin 8.7
men 8.6
smile 8.5
platform 8.5
outdoor 8.4
traditional 8.3
vintage 8.3
girls 8.2
stylish 8.1
history 8
posing 8
lifestyle 7.9
love 7.9
together 7.9
eyes 7.7
expression 7.7
bride 7.7
hand 7.6
world 7.6
joy 7.5
musician 7.5
wedding 7.3
hat 7.3
new 7.3
activity 7.2
glass 7.1
work 7.1
sword 7
entertainer 7
modern 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

wall 97.5
indoor 90.1
clothing 89
dress 88.9
text 86.7
person 85.5
dance 85
woman 83
standing 79.2
posing 74.3
old 54.6
sketch 53.9

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 83%
Confused 76.4%
Sad 20.7%
Calm 1.4%
Disgusted 0.4%
Happy 0.4%
Surprised 0.3%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 43-51
Gender Male, 66%
Confused 42.9%
Happy 33.7%
Calm 10%
Sad 5.8%
Disgusted 2.1%
Angry 1.9%
Surprised 1.9%
Fear 1.7%

AWS Rekognition

Age 37-45
Gender Male, 80.2%
Happy 56%
Calm 14.1%
Sad 12.8%
Confused 6%
Surprised 5.1%
Angry 3.1%
Disgusted 2%
Fear 1%

AWS Rekognition

Age 48-54
Gender Male, 97.4%
Sad 69.8%
Surprised 8%
Confused 7.9%
Fear 3.5%
Happy 3.3%
Calm 3.2%
Angry 2.5%
Disgusted 1.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a group of people posing for a photo 94.8%
a group of people posing for a picture 94.7%
a group of people posing for the camera 94.6%