Human Generated Data

Title

Untitled ("Twin Exposure": man seated at table responding to men with a gun)

Date

1945

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10687

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.6
Human 99.6
Person 99.5
Person 99.4
Clothing 95.9
Apparel 95.9
Beverage 95.4
Drink 95.4
Bottle 91.4
Alcohol 89.8
Nature 74.7
Outdoors 71.7
Beer 67
Face 65.6
Table 64
Furniture 64
Portrait 63.3
Photography 63.3
Photo 63.3
Hat 62.8
Pants 62.7
Chair 62.2
Liquor 56.8
Beer Bottle 56.7
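
The Amazon tag/confidence pairs above are the kind of output AWS Rekognition's DetectLabels API returns. A minimal boto3 sketch follows; the S3 bucket, key, region, and confidence threshold are placeholders, not the museum's actual setup:

    import boto3

    # Rekognition client; region is a placeholder.
    client = boto3.client("rekognition", region_name="us-east-1")

    response = client.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-10687.jpg"}},
        MaxLabels=25,
        MinConfidence=50,
    )

    # Each label carries a name and a 0-100 confidence score,
    # matching entries such as "Person 99.6" above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")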

Imagga
created on 2022-01-15

man 30.2
musical instrument 25.3
male 24.2
people 24
person 23.1
black 21
wind instrument 17.5
men 15.5
adult 15.1
portrait 14.2
silhouette 14.1
couple 13.1
women 12.7
music 12.6
device 12.2
sax 11.5
human 11.2
love 11
musician 10.8
sport 10.7
dance 10.6
one 10.5
accordion 10.1
sunset 9.9
style 9.6
body 9.6
boy 9.6
stringed instrument 9.2
face 9.2
outdoor 9.2
hand 9.1
sexy 8.8
businessman 8.8
happy 8.8
light 8.8
clothing 8.7
concert 8.7
rock 8.7
brass 8.7
play 8.6
power 8.4
dark 8.4
world 8.3
player 8.3
vintage 8.3
holding 8.3
fun 8.2
keyboard instrument 8.1
dirty 8.1
water 8
posing 8
art 7.9
model 7.8
singer 7.8
party 7.7
youth 7.7
casual 7.6
fashion 7.5
guitar 7.5
sound 7.5
park 7.4
entertainment 7.4
business 7.3
dress 7.2
lifestyle 7.2
shadow 7.2
cool 7.1
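
Imagga exposes its tagger as a REST endpoint. A minimal sketch using the requests library; the API key, secret, and image URL are placeholders, and the response shape is assumed from Imagga's public v2 documentation:

    import requests

    # HTTP Basic auth with placeholder credentials.
    auth = ("YOUR_API_KEY", "YOUR_API_SECRET")

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/steinmetz-10687.jpg"},
        auth=auth,
    )
    resp.raise_for_status()

    # Tags arrive as {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}.
    for entry in resp.json()["result"]["tags"]:
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")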

Google
created on 2022-01-15

(no tags returned)

Microsoft
created on 2022-01-15

text 94.9
person 94.7
clothing 94
bottle 93.8
man 90.6
black and white 82.2
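
The Microsoft tags above correspond to Azure Computer Vision's tag feature. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://example.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    # Request only the tags feature; confidence comes back on a 0-1 scale.
    analysis = client.analyze_image(
        "https://example.org/steinmetz-10687.jpg",
        visual_features=[VisualFeatureTypes.tags],
    )
    for tag in analysis.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")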

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 90%
Happy 97.8%
Surprised 1.4%
Disgusted 0.2%
Fear 0.1%
Angry 0.1%
Sad 0.1%
Confused 0.1%
Calm 0.1%
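
The age range, gender, and emotion percentages above follow the shape of Rekognition's DetectFaces response. A minimal boto3 sketch, again with placeholder S3 names:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    response = client.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-10687.jpg"}},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
        # Emotions, highest confidence first, as listed above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")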

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
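
Google Vision reports face attributes on a likelihood scale (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the two blocks above read differently from the Rekognition output. A minimal sketch with the google-cloud-vision client; the image URL is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    image = vision.Image()
    image.source.image_uri = "https://example.org/steinmetz-10687.jpg"

    response = client.face_detection(image=image)

    # One annotation per detected face, hence the two blocks above.
    for face in response.face_annotations:
        for attr in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
            likelihood = getattr(face, f"{attr}_likelihood")
            print(attr.title(), vision.Likelihood(likelihood).name)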

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a man standing in a room 85.8%
a man wearing a hat 77.2%
a group of people wearing costumes 53.1%
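
The ranked captions above match the output of Azure Computer Vision's describe operation, which returns candidate sentences with confidences. A minimal sketch; endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://example.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    # Up to three ranked captions, like the list above.
    analysis = client.describe_image(
        "https://example.org/steinmetz-10687.jpg",
        max_candidates=3,
    )
    for caption in analysis.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")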

Text analysis

Amazon

21564
MJI7
MJI7 YEET A70A
adidas
moon
A70A
YEET
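
Rekognition's DetectText API returns both LINE and WORD detections, which is why the Amazon list above mixes multi-token strings ("MJI7 YEET A70A") with single tokens. A minimal boto3 sketch with placeholder S3 names:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    response = client.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-10687.jpg"}}
    )

    # Type is "LINE" or "WORD"; both appear in the record above.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"])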

Google

21564. M3 YT37A 2 A33A
21564.
M3
YT37A
2
A33A
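
Google Vision's text detection behaves similarly: the first annotation is the full detected block and the rest are the individual tokens, matching the ordering above ("21564. M3 YT37A 2 A33A" first, then its pieces). A minimal sketch with the google-cloud-vision client and a placeholder image URL:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    image = vision.Image()
    image.source.image_uri = "https://example.org/steinmetz-10687.jpg"

    response = client.text_detection(image=image)

    # text_annotations[0] is the whole block; the rest are tokens.
    for annotation in response.text_annotations:
        print(annotation.description)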