Human Generated Data

Title

Untitled (people at beach with trees, Emerald Harbors, Florida)

Date

1959

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10457

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.7
Human 99.7
Person 98.9
Nature 98.8
Outdoors 98.2
Person 95.2
Plant 93
Tree 93
Rural 90.9
Building 90.9
Shelter 90.9
Countryside 90.9
Person 89.7
Ice 88.9
Woodland 87.4
Grove 87.4
Vegetation 87.4
Land 87.4
Forest 87.4
Snow 80.7
Yard 75.4
Person 71.7
Camping 67.5
Flower 65.9
Blossom 65.9
Tent 65.8
Frost 57.6

Imagga
created on 2022-01-09

landscape 37.2
tree 33.9
snow 32.1
trees 31.2
sky 27.6
thatch 21.9
barn 21.7
forest 20.9
winter 20.4
clouds 20.3
weather 20.1
park bench 18.8
sunset 18
bench 17.8
old 17.4
structure 16.9
roof 16.9
rural 16.8
building 16.6
cold 16.4
park 15.7
fog 15.5
scenery 15.3
farm building 15.1
field 15.1
scenic 14.9
scene 14.7
outdoors 14.3
river 14.2
grass 14.2
dark 14.2
countryside 13.7
hay 13.7
protective covering 13.4
season 13.3
sunrise 13.1
sun 13
mountain 12.6
lake 12.5
light 12
seat 12
travel 12
morning 11.8
wood 11.7
dawn 11.6
outdoor 11.5
black 11.4
country 11.4
cloud 11.2
texture 11.1
fence 10.9
environment 10.7
evening 10.3
mountains 10.2
branch 10
water 10
foggy 9.8
covering 9.8
autumn 9.7
fodder 9.5
grunge 9.4
picket fence 9.3
peaceful 9.2
vintage 9.1
land 9.1
road 9
horizon 9
night 8.9
snowy 8.8
freeze 8.8
natural 8.7
antique 8.7
negative 8.7
frost 8.6
bright 8.6
plant 8.3
vacation 8.2
fall 8.2
feed 8
frosty 7.8
overcast 7.8
mist 7.7
wilderness 7.7
dusk 7.6
serene 7.5
pattern 7.5
film 7.4
furniture 7.2

Google
created on 2022-01-09

Nature 89.9
Black 89.6
Plant 86.2
Tree 86.1
Black-and-white 84.8
Adaptation 79.3
Tints and shades 77.4
Monochrome 75.4
Monochrome photography 75.4
Art 74.8
Wood 71.6
Rectangle 70.5
Trunk 67.7
Room 65.6
Visual arts 65
Twig 64.5
Conifer 64.4
Font 61.8
Landscape 59.5
Winter 59.2

Microsoft
created on 2022-01-09

tree 98.6
text 98.4
outdoor 97.5
black and white 83.1
snow 65.6

Face analysis

AWS Rekognition

Age 18-24
Gender Male, 78.6%
Calm 64.3%
Sad 24.4%
Fear 4.5%
Happy 2.7%
Confused 1.9%
Disgusted 0.9%
Angry 0.7%
Surprised 0.5%

AWS Rekognition

Age 20-28
Gender Female, 62.4%
Calm 55.7%
Sad 23.5%
Happy 9.7%
Angry 3.3%
Fear 3.2%
Disgusted 2.1%
Confused 1.5%
Surprised 0.9%

AWS Rekognition

Age 6-14
Gender Female, 61.4%
Happy 41.3%
Calm 16.4%
Sad 15.8%
Fear 7.3%
Surprised 7.1%
Confused 6.1%
Angry 3.8%
Disgusted 2.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Tent 65.8%

Captions

Microsoft

an old photo of a horse 70.5%
an old photo of a person 53.2%
old photo of a person 46.3%

Text analysis

Amazon

43917
VAGON

Google

43917
43917