Human Generated Data

Title

Untitled (man at water pump)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7755

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Clothing 99.9
Apparel 99.9
Person 99.1
Human 99.1
Coat 98.4
Raincoat 93.2
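
The Amazon tags above are the kind of label-and-confidence output returned by AWS Rekognition's DetectLabels operation. A minimal sketch of such a call using boto3, assuming configured AWS credentials and a local copy of the photograph (the filename here is hypothetical):

    import boto3

    # Read the photograph as raw bytes (hypothetical local path).
    with open("steinmetz_4.2002.7755.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with confidence scores (0-100),
    # comparable to the tag list above.
    rekognition = boto3.client("rekognition")
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=10,
        MinConfidence=70,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")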

Clarifai
created on 2023-10-25

people 99.7
adult 98.9
one 98.2
two 97.1
monochrome 96.3
wear 94.9
man 94.4
vehicle 92.5
portrait 92.1
outfit 89.9
retro 89.7
military 88.4
administration 87.6
actor 87.4
war 87.4
microphone 86.4
music 81.7
gun 79.7
street 78
facial expression 77.9

Imagga
created on 2022-01-09

device 28.8
statue 17
person 16.9
megaphone 14.9
sculpture 14.5
adult 14.2
telephone 13.7
man 12.9
acoustic device 12.6
city 12.5
people 12.3
equipment 12
black 12
building 12
art 11.8
portrait 11.6
old 11.1
male 10.6
human 10.5
urban 10.5
hair 10.3
sexy 9.6
ancient 9.5
architecture 9.4
dark 9.2
pretty 9.1
weapon 9
microphone 8.8
instrument 8.7
antique 8.6
model 8.6
wall 8.5
face 8.5
electronic equipment 8.3
one 8.2
religion 8.1
body 8
attractive 7.7
fashion 7.5
fountain 7.5
vintage 7.4
holding 7.4
tourism 7.4
style 7.4
lady 7.3
danger 7.3
sensuality 7.3
dress 7.2
lifestyle 7.2
posing 7.1
interior 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 97.8
black and white 88.8
person 88.1
clothing 84.5
man 82.3
black 68.3

Face analysis

AWS Rekognition

Age 48-56
Gender Female, 65.8%
Happy 41.8%
Calm 27.4%
Surprised 20.8%
Disgusted 4.2%
Fear 1.9%
Sad 1.4%
Angry 1.3%
Confused 1.2%
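
The age range, gender, and emotion percentages above match fields of the FaceDetails structure returned by Rekognition's DetectFaces operation. A sketch under the same assumptions as the earlier boto3 example (hypothetical image path):

    import boto3

    with open("steinmetz_4.2002.7755.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include AgeRange, Gender, Emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions are reported as a list of type/confidence pairs.
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")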

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
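
The ratings above (Very unlikely through Very likely) are the likelihood values that the Google Cloud Vision face detection API reports per detected face. A sketch assuming the google-cloud-vision client library and application default credentials are set up (the image path is hypothetical):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_4.2002.7755.jpg", "rb") as f:
        content = f.read()

    response = client.face_detection(image=vision.Image(content=content))

    for face in response.face_annotations:
        # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)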

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft
created on 2022-01-09

a man standing in front of a building 81.9%
an old photo of a man 81.8%
old photo of a man 81.7%

Text analysis

Amazon

24557.
YТ3-X

Google

YT37A2-XAGO