Human Generated Data

Title

Untitled (man and woman pruning plant outside of trailer home)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8866

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Nature 99.3
Outdoors 98.3
Person 97.6
Human 97.6
Person 95.3
Ice 89.5
Vegetation 78.5
Plant 78.5
Snow 78
Yard 77.1
Grove 76.3
Land 76.3
Woodland 76.3
Tree 76.3
Forest 76.3
Countryside 76.1
Weather 75.3
Road 63.3
Housing 62.7
Building 62.7
Rural 61.6
Frost 57.2
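
The labels above are confidence-scored outputs of an image-recognition service. Below is a minimal sketch, assuming boto3 and a local copy of the image (the file name and confidence threshold are illustrative, not the museum's actual pipeline), of how comparable label/score pairs can be requested from Amazon Rekognition:

```python
# Minimal sketch: request labels for an image and print "Name Confidence"
# pairs in the same style as the list above. The file name and threshold
# are assumptions for illustration only.
import boto3


def detect_labels(image_path: str, min_confidence: float = 55.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Each label carries a name and a confidence score in percent.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]


for name, confidence in detect_labels("steinmetz_4.2002.8866.jpg"):
    print(f"{name} {confidence:.1f}")
```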

Clarifai
created on 2023-10-26

people 99.7
group together 97.8
adult 97.8
street 97.1
man 96.8
group 96.6
home 96.3
monochrome 95.7
war 94.9
soldier 94.3
military 93.4
woman 92.9
many 92.1
vehicle 90.9
two 90.8
administration 90.3
flame 90.2
calamity 90.1
police 89.2
weapon 88.2

Imagga
created on 2022-01-15

structure 42.4
landscape 35
sky 34
mobile home 31.4
billboard 29.9
wheeled vehicle 25.5
trailer 25
housing 24.8
signboard 24.2
clouds 23.7
snow 22.1
cloud 21.5
sunset 20.7
trees 19.6
tree 18.8
weather 18.1
vehicle 17.9
travel 17.6
old 15.3
rural 15
water 14.7
winter 14.5
scenery 14.4
sunrise 14.1
park 14
field 13.4
dark 13.4
season 13.3
scenic 13.2
environment 13.2
forest 13.1
lake 12.9
summer 12.9
mountain 12.9
river 12.5
scene 12.1
light 12
countryside 11.9
outdoor 11.5
country 11.4
cold 11.2
mountains 11.1
road 10.8
black 10.8
horizon 10.8
fog 10.6
storm 10.6
grass 10.3
smoke 10.2
architecture 10.2
motor vehicle 10.2
tourism 9.9
car 9.9
vacation 9.8
sun 9.7
outdoors 9.4
hill 9.4
building 9.3
house 9.2
silhouette 9.1
overcast 8.8
dusk 8.6
climate 8.5
stage 8.3
new 8.1
history 8.1
natural 8
night 8
autumn 7.9
day 7.8
color 7.8
dawn 7.7
mist 7.7
plant 7.7
pollution 7.7
industry 7.7
frost 7.7
power 7.6
cloudy 7.5
land 7.5
air 7.4
reflection 7.3
industrial 7.3
fall 7.2
morning 7.2
television 7.2
truck 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

tree 99.7
text 97.4
black and white 89.4
monochrome 50.5
old 40.9

Color Analysis

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 99.3%
Sad 42.8%
Calm 38.1%
Confused 9.4%
Fear 2.8%
Surprised 2.7%
Disgusted 1.7%
Angry 1.4%
Happy 1.1%

AWS Rekognition

Age 21-29
Gender Male, 90.6%
Calm 86.9%
Sad 10.4%
Happy 0.9%
Surprised 0.7%
Angry 0.5%
Confused 0.3%
Disgusted 0.2%
Fear 0.1%
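
The two AWS Rekognition blocks above report an estimated age range, a gender prediction, and per-emotion confidences for each detected face. A minimal sketch follows, assuming boto3 and a hypothetical local file name, of how such values can be retrieved:

```python
# Minimal sketch: detect faces with full attributes and print the age range,
# gender, and emotion confidences for each face. The file name is a
# placeholder, not the museum's actual asset.
import boto3

client = boto3.client("rekognition")
with open("steinmetz_4.2002.8866.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back with per-emotion confidences, as listed above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```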

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
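
The Google Vision results above are likelihood ratings rather than percentages. A minimal sketch, assuming the google-cloud-vision client library and a hypothetical file name, of reading the same fields from a face-detection response:

```python
# Minimal sketch: run face detection and print the likelihood fields that
# correspond to the ratings shown above. The file name is an assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("steinmetz_4.2002.8866.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each field is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```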

Feature analysis

Amazon

Person 97.6%

Categories

Captions

Microsoft
created on 2022-01-15

an old photo of a person 61.5%
old photo of a person 59.6%

Text analysis

Amazon

39850
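
A minimal sketch, assuming boto3 and a hypothetical file name, of how detected text such as the number above can be extracted with Amazon Rekognition; text-detection services typically return both line-level and word-level readings, which is one way results like "398 50" alongside "398" and "50" (below) can appear:

```python
# Minimal sketch: detect text in the image and print line-level detections
# with their confidences. The file name is a placeholder.
import boto3

client = boto3.client("rekognition")
with open("steinmetz_4.2002.8866.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE detections group words; WORD detections are individual tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```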

Google

398 50
398
50