Human Generated Data

Title

Untitled (people at beach with trees, Emerald Harbors, Florida)

Date

1959

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10456

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.1
Person 99.1
Nature 98.8
Outdoors 98.7
Person 98.5
Ice 97.3
Plant 94.4
Tree 94.4
Person 90.4
Snow 87.7
Person 85.7
Vegetation 84.9
Woodland 80.3
Land 80.3
Grove 80.3
Forest 80.3
Yard 78.2
Camping 72.9
Meal 71.5
Food 71.5
Frost 70.7
Person 68.3
Person 68.2
Grass 64.2
Tent 64.0
Leisure Activities 63.1
Female 56.7
Vacation 55.4
Picnic 55.4
Winter 55.4

Imagga
created on 2022-01-09

thatch 74.1
roof 64.8
protective covering 44.9
landscape 32.7
covering 30.5
tree 26.9
sky 23.3
old 23
building 21.2
trees 16.9
clouds 16.1
rural 15.9
structure 15.7
winter 15.3
barn 15.2
outdoor 14.5
house 14.2
snow 14.1
park 14.1
scene 13.9
grunge 13.6
river 13.3
forest 13.1
water 12.7
dark 12.5
texture 12.5
hay 12.4
cloud 12.1
countryside 11.9
sunset 11.7
wood 11.7
summer 11.6
travel 11.3
antique 11.3
weather 11.1
field 10.9
scenery 10.8
vintage 10.8
environment 10.7
outdoors 10.6
country 10.5
lake 10.4
grass 10.3
mountain 9.9
farm building 9.9
vacation 9.8
scenic 9.7
black 9.6
sunrise 9.4
light 9.4
fodder 9.3
sun 8.9
art 8.5
peace 8.2
horizon 8.1
night 8
holiday 7.9
colorful 7.9
feed 7.8
architecture 7.8
cold 7.7
dawn 7.7
greenhouse 7.6
bench 7.5
hill 7.5
frame 7.5
mountains 7.4
retro 7.4
peaceful 7.3
calm 7.3
aged 7.2
dirty 7.2
road 7.2
home 7.2
plant 7.1
sea 7
season 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

tree 99.3
outdoor 97.8
text 97.5
black and white 76

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 90.1%
Calm 99.1%
Happy 0.6%
Fear 0.2%
Sad 0%
Disgusted 0%
Confused 0%
Surprised 0%
Angry 0%

AWS Rekognition

Age 22-30
Gender Male, 94.1%
Calm 85.3%
Fear 4.3%
Surprised 2.9%
Happy 2.6%
Sad 2.2%
Angry 1.7%
Disgusted 0.6%
Confused 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Tent 64%

Captions

Microsoft

a group of people riding on the back of a horse 42.3%
a group of people standing next to a horse 42.2%
a group of people looking at a horse 42.1%

Text analysis

Amazon

43910

Google

43910
43910