Human Generated Data

Title

Untitled (wedding guests standing near cake table)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8747

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label, confidence in %)

Amazon
created on 2022-01-09

Person 97.3
Human 97.3
Person 97.3
Person 90
Person 87.2
Person 87
Person 86.4
Person 84.8
Person 84.6
Person 82.2
Person 80.3
Person 78.8
Poster 77.2
Advertisement 77.2
Urban 72.1
Person 71.6
Person 70.8
Meal 69.7
Food 69.7
People 68.5
Outdoors 68.2
Person 61.6
Nature 60.6
Person 60.5
Person 57.9
Clothing 57.8
Apparel 57.8
Photography 57.8
Photo 57.8
Face 56.2
Person 49
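
Labels of this kind, each paired with a percent confidence, are what the Rekognition DetectLabels API returns. A minimal Python sketch with boto3 (the bucket and object key are hypothetical placeholders, not the record's actual storage location):

    import boto3

    # Ask Rekognition for object/scene labels on the photograph.
    rekognition = boto3.client("rekognition")
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8747.jpg"}},
        MinConfidence=45,  # the list above bottoms out near 49%
    )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")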

Imagga
created on 2022-01-09

musical instrument 31.5
accordion 17.3
man 16.1
wind instrument 15.5
person 15.4
old 14.6
keyboard instrument 14
adult 12.4
people 12.3
black 11.4
shop 11.2
holiday 10.7
clothing 10.7
bass 10.5
shopping cart 10.5
style 10.4
wheeled vehicle 10.3
grunge 10.2
business 9.7
urban 9.6
male 9.3
shopping 9.2
studio 9.1
stringed instrument 9.1
music 9
equipment 8.9
happy 8.8
play 8.6
men 8.6
store 8.5
house 8.3
device 8.3
one 8.2
dirty 8.1
handcart 8.1
standing 7.8
cart 7.8
portrait 7.8
outdoors 7.6
power 7.5
fashion 7.5
buy 7.5
city 7.5
vintage 7.4
guitar 7.4
basket 7.3
seller 7.3
art 7.2
hair 7.1
women 7.1
market 7.1
child 7.1
day 7.1
bowed stringed instrument 7
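
Imagga exposes a comparable REST tagging service. A hedged sketch against its v2 /tags endpoint (the credentials and image URL are placeholders, and the response shape should be confirmed against Imagga's current documentation):

    import requests

    # Request (tag, confidence) pairs from Imagga's v2 tagging endpoint.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/steinmetz-8747.jpg"},
        auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # HTTP basic auth
    )
    for item in response.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))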

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

outdoor 98.5
black and white 93.6
person 93.5
people 80
text 73.4
group 72.1
man 63.1
clothing 62.3
monochrome 51.3
crowd 25.1
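
Tags of this form match the output of Microsoft's Computer Vision Analyze Image operation. A sketch with the Python SDK (the endpoint and subscription key are placeholders):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<subscription-key>"),
    )
    # Request only the 'tags' feature; each tag carries a 0-1 confidence.
    analysis = client.analyze_image(
        "https://example.com/steinmetz-8747.jpg",
        visual_features=[VisualFeatureTypes.tags],
    )
    for tag in analysis.tags:
        print(tag.name, round(tag.confidence * 100, 1))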

Face analysis

Amazon

AWS Rekognition

Age 16-24
Gender Female, 55.6%
Sad 49.3%
Calm 34.8%
Fear 5.5%
Surprised 4.4%
Confused 3.6%
Disgusted 1.4%
Angry 0.5%
Happy 0.4%

AWS Rekognition

Age 21-29
Gender Female, 83.1%
Calm 82.4%
Happy 16.4%
Sad 0.6%
Surprised 0.2%
Disgusted 0.1%
Confused 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 16-24
Gender Female, 61.8%
Calm 63.9%
Angry 28.1%
Sad 4.5%
Happy 1.4%
Surprised 0.7%
Confused 0.6%
Fear 0.4%
Disgusted 0.4%
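
Per-face estimates like these (age range, gender, ranked emotions) correspond to Rekognition's DetectFaces with all facial attributes requested. A minimal boto3 sketch (the image location is again a hypothetical placeholder):

    import boto3

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8747.jpg"}},
        Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unsorted; rank them as in the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")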

Feature analysis

Amazon

Person 97.3%
Poster 77.2%

Captions

Microsoft

a group of people standing in front of a crowd 84.5%
a group of people walking down a street 84.4%
a group of people walking down the street 84.3%
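
Ranked caption candidates with close confidence scores like these match the Computer Vision Describe Image operation. A sketch with the same placeholder credentials as the tags example above:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<subscription-key>"),
    )
    # Ask for up to three ranked caption candidates.
    description = client.describe_image(
        "https://example.com/steinmetz-8747.jpg",
        max_candidates=3,
    )
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")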

Text analysis

Amazon

5
38594
٢8
YТ3-X
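
OCR fragments like these, including mixed-script misreads, are typical of Rekognition's DetectText when run on incidental lettering in a photograph. A minimal boto3 sketch (image location hypothetical):

    import boto3

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8747.jpg"}}
    )
    # LINE detections aggregate WORD detections; print the lines only.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])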

Google

38S94 58 YT37A°2- AO
38S94
58
YT37A°2-
AO