Human Generated Data

Title

Untitled ("Twin Exposure": Steinmetz pointing at Steinmetz as he salutes)

Date

1945

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10688

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.7
Human 99.7
Person 99.3
Clothing 94
Apparel 94
Furniture 88.4
Sitting 74.3
Couch 72.1
Shoe 70.1
Footwear 70.1
Pants 70.1
Back 63.3
Screen 61.1
Electronics 61.1
Meal 61.1
Food 61.1
Shorts 60.6
Undershirt 59.2
Monitor 58.8
Display 58.8
Skin 58.5
Musician 56.2
Musical Instrument 56.2

Clarifai
created on 2023-10-26

people 99.8
monochrome 95.6
man 95.5
two 94.8
woman 93.6
portrait 91.9
room 91.9
adult 91.8
art 91
shadow 89.2
child 89.1
one 87.4
indoors 86.9
furniture 82.4
music 80.7
family 80.2
dancer 80
position 79.9
sit 79.5
girl 78.8

Imagga
created on 2022-01-15

person 33.6
musical instrument 29.9
adult 29.8
dancer 28.9
people 26.8
performer 25.8
wind instrument 24.2
man 22.9
attractive 20.3
harmonica 19.6
teacher 19.4
sexy 19.3
male 19.2
black 18.5
body 18.4
dark 18.4
model 17.9
professional 17.8
posing 17.8
fashion 16.6
lady 16.2
portrait 16.2
passion 16
free-reed instrument 15.9
educator 15.9
one 15.7
style 15.6
business 15.2
hair 15.1
silhouette 14.9
entertainer 14.9
studio 14.4
looking 14.4
human 14.2
keyboard instrument 14.2
room 13.9
accordion 13.9
pretty 13.3
erotic 13.2
office 13
men 12.9
sensual 12.7
dance 12.7
chair 12.6
lifestyle 12.3
couple 12.2
suit 12.1
love 11.8
businessman 11.5
sitting 11.2
sensuality 10.9
happy 10.7
indoors 10.5
seductive 10.5
executive 10.3
women 10.3
elegance 10.1
indoor 10
performance 9.6
happiness 9.4
youth 9.4
sunset 9
fun 9
brunette 8.7
dancing 8.7
corporate 8.6
elegant 8.6
pleasure 8.5
casual 8.5
leisure 8.3
makeup 8.2
teenager 8.2
dress 8.1
romance 8
interior 8
clothing 7.8
motion 7.7
wife 7.6
legs 7.6
relaxation 7.5
water 7.3
alone 7.3
blond 7.3
make 7.3
gorgeous 7.3
pose 7.3
music 7.2
stylish 7.2
night 7.1
face 7.1
stringed instrument 7
sky 7

Google
created on 2022-01-15

(no tags returned)

Microsoft
created on 2022-01-15

building 99.5
text 97.7
black and white 91.3
person 87.9
clothing 86.8
man 83.8

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 92.3%
Happy 63.1%
Sad 11.6%
Confused 7.7%
Calm 4.2%
Surprised 4%
Angry 3.3%
Disgusted 3.1%
Fear 3%

AWS Rekognition

Age 27-37
Gender Male, 76%
Calm 98.7%
Happy 0.6%
Surprised 0.3%
Sad 0.1%
Fear 0.1%
Confused 0.1%
Disgusted 0.1%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 70.1%

Text analysis

Amazon

21546.
-

Google

21546.
21546.