Human Generated Data

Title

Untitled (man and woman pruning plant outside of trailer home)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8865

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label followed by confidence score, in percent)

Amazon
created on 2022-01-15

Person 99
Human 99
Nature 96.4
Outdoors 93.5
Train 79.2
Transportation 79.2
Vehicle 79.2
Vegetation 73
Plant 73
Ice 71.1
Yard 70.7
Tree 66
Housing 65.9
Building 65.9
Countryside 64.2
Snow 57.8
Weather 57.5
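
The Amazon tags above pair a label with a confidence score, which is the shape of output returned by AWS Rekognition's DetectLabels API. A minimal sketch of how such a listing could be reproduced, assuming boto3 with configured AWS credentials and a hypothetical local file photo.jpg; MaxLabels and MinConfidence are illustrative values, not the settings used to generate the tags above:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,        # cap the number of labels returned
        MinConfidence=50.0,  # drop low-confidence labels
    )

    # Each label carries a name and a confidence in percent, e.g. "Person 99.0".
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))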

Clarifai
created on 2023-10-26

people 99.8
adult 97.4
street 96.4
group together 96.3
man 95.8
home 95.4
group 95.3
monochrome 95
war 93.9
calamity 92.9
soldier 92.1
vehicle 90.7
military 90.6
flame 90
woman 88.6
two 88.5
smoke 86.2
house 85.1
many 84.2
town 83.7

Imagga
created on 2022-01-15

mobile home 55.7
structure 55.2
trailer 44.5
housing 44
wheeled vehicle 37.5
landscape 32
sky 30.8
tree 28.2
vehicle 24.4
clouds 23.7
trees 22.3
billboard 20.9
old 17.4
cloud 17.2
signboard 16.9
forest 16.6
rural 15.9
snow 15.7
winter 15.3
sunset 15.3
building 15
weather 14.8
travel 14.8
mountains 13
water 12.7
field 12.6
house 12.5
architecture 12.5
lake 12.4
scenic 12.3
scene 12.1
mountain 11.9
park 11.6
fog 11.6
river 11.6
outdoor 11.5
country 11.4
sunrise 11.3
conveyance 11.2
cold 11.2
road 10.8
scenery 10.8
sun 10.5
grass 10.3
countryside 10.1
light 10
black 9.6
dark 9.2
environment 9.1
summer 9
vacation 9
autumn 8.8
roof 8.7
season 8.6
cloudy 8.4
wood 8.3
city 8.3
street 8.3
vintage 8.3
tourism 8.3
horizon 8.1
history 8.1
antique 7.8
hills 7.8
houses 7.8
mist 7.7
orange 7.7
hill 7.5
evening 7.5
smoke 7.4
outdoors 7.3
fall 7.2
morning 7.2
night 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

tree 99
text 98.3
black and white 92.4
house 84.4
old 64.3
sky 63.1
monochrome 51.2

Color Analysis

Face analysis

AWS Rekognition (first detected face)

Age 36-44
Gender Male, 57.6%
Calm 41.3%
Happy 29.4%
Sad 18.4%
Angry 3.2%
Confused 2.8%
Disgusted 2.4%
Surprised 1.5%
Fear 1%

AWS Rekognition (second detected face)

Age 16-24
Gender Male, 97.6%
Calm 71.6%
Sad 9.6%
Confused 6.6%
Disgusted 5.4%
Fear 3.3%
Surprised 2.2%
Happy 0.7%
Angry 0.7%

Google Vision (first detected face)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (second detected face)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
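
The AWS Rekognition face results above (an estimated age range, a gender guess with confidence, and an emotion breakdown in percent) correspond to the attributes returned by the DetectFaces API when all attributes are requested. A minimal sketch, again assuming boto3 with configured AWS credentials and a hypothetical photo.jpg:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back as a list of type/confidence pairs, like the
        # Calm/Happy/Sad breakdown shown above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")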

Feature analysis

Amazon

Person 99%
Train 79.2%

Captions

Text analysis

Amazon

39850-A

Google

39850-A
39850-A
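
Both services read the string 39850-A, likely a file or negative number on the print. Output of this kind can be produced with AWS Rekognition's DetectText API; a minimal sketch under the same assumptions as the earlier examples (boto3, configured AWS credentials, hypothetical photo.jpg):

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Detections come back at both LINE and WORD granularity, which is one
    # reason OCR listings can show the same string more than once.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")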