Human Generated Data

Title

Untitled (wedding guests standing near cake table next to house)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8750

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Home Decor 99.7
Person 97.7
Human 97.7
Person 96.3
Person 96.1
Person 94
Chair 85.9
Furniture 85.9
Window 83.7
Chair 79.2
Person 73.9
Person 70.6
Outdoors 67.3
Person 64.8
Curtain 64.7
Urban 64.2
Art 62.2
Person 62.1
Nature 61.2
Clothing 59.3
Apparel 59.3
Chair 59.1
Building 58.5
Window Shade 56.9
Shutter 56.2
Person 53.8
Person 47.3

Imagga
created on 2022-01-09

man 24.2
musical instrument 22.4
silhouette 19.9
people 19.5
person 18.8
male 16.4
chair 14.6
sky 12.7
sunset 12.6
outdoors 12.1
sun 12.1
adult 11.9
wind instrument 11.6
water 11.3
men 11.2
black 10.9
dark 10.9
outdoor 10.7
business 10.3
sax 9.8
stringed instrument 9.7
accordion 9.7
landscape 9.7
couple 9.6
sitting 9.4
day 9.4
relax 9.3
bowed stringed instrument 9.1
human 9
working 8.8
women 8.7
beach 8.5
keyboard instrument 8.5
building 8.5
portrait 8.4
power 8.4
summer 8.4
protection 8.2
night 8
love 7.9
device 7.8
happiness 7.8
sea 7.8
dusk 7.6
smoke 7.4
style 7.4
park 7.4
light 7.4
lake 7.3
danger 7.3
group 7.3
lifestyle 7.2
romance 7.1
travel 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 94.3
outdoor 86.6
black and white 86.2
clothing 75.9
person 74.8
white 67.9

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Male, 93.2%
Calm 58.9%
Sad 33.4%
Confused 3.3%
Fear 1.2%
Happy 1%
Angry 0.9%
Surprised 0.8%
Disgusted 0.6%

AWS Rekognition

Age 21-29
Gender Male, 98.2%
Happy 35.9%
Calm 27.3%
Angry 20.3%
Sad 8.9%
Fear 4%
Disgusted 2.1%
Surprised 0.9%
Confused 0.6%

AWS Rekognition

Age 23-31
Gender Male, 87.6%
Calm 97.1%
Sad 1.1%
Surprised 0.6%
Fear 0.4%
Confused 0.3%
Happy 0.2%
Disgusted 0.2%
Angry 0.1%

Feature analysis

Amazon

Person 97.7%
Chair 85.9%

Captions

Microsoft

a group of people standing in front of a building 76.5%
a group of people in front of a building 76.4%
a group of people standing outside of a building 76.3%

Text analysis

Amazon

5
38600
د8

Google

58 YT37A°2-XAdO
58
YT37A°2-XAdO