Human Generated Data

Title

Untitled (Harvard Hasty Pudding Club: men waving from train station gate)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8202

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Wheel 99
Machine 99
Person 95.9
Human 95.9
Person 94.7
Person 94.6
Clothing 92.6
Apparel 92.6
Vehicle 86.5
Transportation 86.5
Bike 84.7
Bicycle 81.9
Wheel 78.8
Person 78.5
Coat 68.6
Overcoat 68.2
People 61.9
Person 61.9
Leisure Activities 59
Portrait 58.8
Photography 58.8
Face 58.8
Photo 58.8

Clarifai
created on 2023-10-25

people 99.9
monochrome 99.8
woman 98.6
adult 97.8
street 97.6
two 95.9
man 95.9
group 95.7
group together 93.8
wear 92.9
transportation system 90.2
wedding 88.6
child 85.6
carriage 84.2
administration 83.7
vehicle 82.9
wheelchair 82
gown (clothing) 81.1
art 80.1
war 79.1

Imagga
created on 2022-01-08

pay-phone 47
telephone 40.5
device 37.7
hand blower 35.7
electronic equipment 30.6
dryer 29.5
blower 28.5
appliance 26.2
equipment 22.8
adult 16.8
person 16.8
durables 16.3
home 15.9
people 15.6
interior 15
house 14.2
indoors 14
man 12.8
pretty 12.6
power 12.6
portrait 12.3
male 12
black 12
industry 11.9
lady 11.3
attractive 11.2
technology 11.1
window 11.1
indoor 10.9
room 10.6
television 10.4
business 10.3
office 10
fashion 9.8
cable 9.6
sexy 9.6
lifestyle 9.4
holding 9.1
human 9
kitchen 8.9
refrigerator 8.8
station 8.7
communication 8.4
working 7.9
support 7.8
standing 7.8
life 7.8
door 7.8
model 7.8
consumer goods 7.7
modern 7.7
gas 7.7
hand 7.6
looking 7.2
cute 7.2
transportation 7.2
pump 7.1
face 7.1
travel 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99
person 93.3
black and white 79.4
screenshot 73.8
people 63.8
cartoon 58

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Sad 71.9%
Surprised 9.6%
Disgusted 9.5%
Confused 4.6%
Calm 1.9%
Angry 1.1%
Happy 1%
Fear 0.5%

AWS Rekognition

Age 37-45
Gender Male, 89.6%
Surprised 45.7%
Confused 19.7%
Sad 13.4%
Happy 9.4%
Calm 5.6%
Fear 4.1%
Disgusted 1.2%
Angry 1%

AWS Rekognition

Age 37-45
Gender Male, 100%
Sad 78.6%
Confused 10.5%
Angry 7.1%
Surprised 1.4%
Disgusted 1.4%
Happy 0.6%
Fear 0.3%
Calm 0.2%

Feature analysis

Amazon

Wheel 99%
Person 95.9%
Bicycle 81.9%

Text analysis

Amazon

TRENTON
SOUTH
LOCAL
PHILADELPHIA
STATIONS
AND
ETWEEN TRENTON AND PHILADELPHIA
EST. SOUTH AND LOCAL STATIONS
ETWEEN
6585
EST.

Google

658€ OPILADELPHIA EST.SOUTH ETMEEN TRENTON AMDPHILADELPHIA PLOCAL STATIONS AND AND
658€
OPILADELPHIA
EST.SOUTH
ETMEEN
TRENTON
AMDPHILADELPHIA
PLOCAL
STATIONS
AND