Human Generated Data

Title

Untitled (two men getting their shoes polished in park)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7488

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.5
Person 99.5
Person 99.2
Apparel 91.6
Clothing 91.6
Furniture 83.8
Text 78.6
Food 74.9
Meal 74.9
People 69
Outdoors 67.9
Urban 67.5
Table 67
Poster 66.4
Advertisement 66.4
Wood 65.8
Nature 65.4
Sitting 64
Plant 63.4
Tree 63.4
Female 62.1
Building 60.7
Town 60.7
City 60.7
Monitor 57
Screen 57
Display 57
Electronics 57
Leisure Activities 56.1

Imagga
created on 2022-01-08

deck 56.9
boat 32.9
sea 32
water 28.7
ship 26.2
vessel 25.9
boats 24.3
travel 23.9
ocean 23.2
transportation 19.7
harbor 18.3
marina 17.4
transport 17.3
tourism 16.5
yacht 16
sailing 15.6
loom 15.4
port 15.4
vacation 14.7
nautical 14.6
machine 14.1
summer 14.1
sky 14
sailboat 13.7
bay 13.2
industrial 12.7
dock 12.6
textile machine 12.6
man 12.1
luxury 12
outdoors 11.9
beach 11.8
newspaper 11.6
industry 11.1
pier 10.9
leisure 10.8
cruise 10.7
sail 10.7
device 10.2
holiday 10
city 10
moored 9.9
old 9.7
marine 9.5
product 9.2
tourist 9.1
black 9
people 8.9
work 8.9
river 8.9
craft 8.8
ships 8.8
building 8.8
wave 8.6
construction 8.6
journey 8.5
power 8.4
relaxation 8.4
equipment 8.1
coast 8.1
metal 8
yachts 7.9
architecture 7.8
person 7.7
passenger 7.6
waves 7.4
town 7.4
business 7.3
sun 7.2
creation 7.1
steel 7.1

Microsoft
created on 2022-01-08

text 99.9
newspaper 77.9
black and white 64.8

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 60.6%
Calm 97.8%
Sad 2.1%
Happy 0.1%
Confused 0%
Angry 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 100%
Happy 0%
Surprised 0%
Sad 0%
Disgusted 0%
Angry 0%
Confused 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a man sitting on a boat 26.8%
a man on a boat 26.7%
a man in a newspaper 26.6%

Text analysis

Amazon

28716B.
are
KODAK-SVEELA

Google

287163,
287163, YT37A2-XA
YT37A2-XA