Human Generated Data

Title

Untitled (women wearing dresses, onstage)

Date

1946

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19140

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.8%
Human 98.8%
Person 98.8%
Apparel 98.1%
Clothing 98.1%
Person 97.8%
Person 97.8%
Person 96.2%
Person 94.6%
Dress 84.1%
Female 80.8%
Stage 71.4%
Sleeve 68.8%
Long Sleeve 67.3%
Gown 64.8%
Robe 64.8%
Fashion 64.8%
Evening Dress 64.8%
Woman 62.5%
People 61.1%
Head 59.1%
Crowd 57%
Face 55.2%

Imagga
created on 2022-03-05

brass 63%
wind instrument 52.1%
musical instrument 39%
man 29.5%
cornet 29.3%
people 28.4%
male 25.5%
couple 24.4%
adult 21.6%
happy 20.7%
business 20%
businessman 19.4%
person 18.8%
professional 18.7%
men 18%
group 16.1%
women 15.8%
smiling 14.5%
kin 14.2%
happiness 14.1%
outfit 14%
corporate 13.7%
dress 13.5%
family 13.3%
teacher 13.1%
team 12.5%
job 12.4%
groom 12.3%
smile 12.1%
office 12%
love 11.8%
bride 11.7%
wedding 11%
two 11%
portrait 11%
day 11%
black 10.4%
teamwork 10.2%
children 10%
holding 9.9%
fashion 9.8%
executive 9.5%
bouquet 9.4%
meeting 9.4%
suit 9%
classroom 8.9%
building 8.9%
home 8.8%
boy 8.7%
child 8.7%
boss 8.6%
marriage 8.5%
youth 8.5%
clothing 8.5%
worker 8.5%
performer 8.4%
pretty 8.4%
attractive 8.4%
mature 8.4%
businesswoman 8.2%
new 8.1%
to 8%
lifestyle 7.9%
together 7.9%
uniform 7.9%
work 7.8%
standing 7.8%
hands 7.8%
education 7.8%
full length 7.7%
diversity 7.7%
married 7.7%
thinking 7.6%
adults 7.6%
mother 7.4%
celebration 7.2%
holiday 7.2%
singer 7.1%
handsome 7.1%
indoors 7%
modern 7%

Microsoft
created on 2022-03-05

dress 97.4%
clothing 96.1%
person 95.9%
woman 93%
standing 92.2%
window 87.9%
text 87.6%
posing 83.6%
wedding dress 74.2%
old 73.9%
group 63.1%
clothes 19.7%

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 94%
Happy 58%
Calm 22.8%
Disgusted 12%
Confused 2.1%
Surprised 2%
Sad 1.3%
Angry 1.2%
Fear 0.4%

AWS Rekognition

Age 37-45
Gender Male, 96%
Calm 37%
Happy 25.7%
Confused 15.2%
Surprised 8.4%
Angry 5.7%
Disgusted 5%
Sad 2.5%
Fear 0.5%

AWS Rekognition

Age 52-60
Gender Male, 99.9%
Happy 71.9%
Calm 15.9%
Surprised 9.7%
Angry 0.8%
Disgusted 0.7%
Sad 0.3%
Confused 0.3%
Fear 0.2%

AWS Rekognition

Age 49-57
Gender Male, 92.6%
Happy 91.3%
Surprised 3.6%
Sad 3%
Calm 0.7%
Disgusted 0.5%
Confused 0.4%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 45-53
Gender Male, 99.8%
Happy 97.4%
Sad 1.2%
Confused 0.3%
Calm 0.3%
Surprised 0.2%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 26-36
Gender Female, 88.2%
Calm 65.4%
Sad 15.9%
Happy 7.3%
Surprised 6.7%
Fear 2%
Disgusted 1.6%
Angry 0.8%
Confused 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%

Captions

Microsoft

a group of people posing for a photo 96.2%
a group of people posing for a picture 96.1%
a group of people posing for the camera 96%

Text analysis

Amazon

113
YT37A2
M
Ancona
M 113 YT37A2 00201
00201

Google

ME YT33A2 022
022
ME
YT33A2