Human Generated Data

Title

Untitled (people on beach next to small tree)

Date

1959

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8054

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.7
Human 99.7
Person 99.5
Person 99.3
Person 97.5
Person 95.8
Person 92.9
Furniture 87.1
Chair 87.1
Grass 84.9
Plant 84.9
Person 79.7
Apparel 77.5
Clothing 77.5
Tent 71.8
Shorts 68.7
People 68.3
Outdoors 62.6
Photography 61.9
Photo 61.9
Camping 61.4
Female 61.1
Girl 58.8
Tree 55.5
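Each machine-generated tag above pairs a label with a confidence score. A minimal sketch, assuming a Rekognition-style `DetectLabels` response shape (a list of `{"Name", "Confidence"}` records; the sample values are taken from the tag list above, and the `filter_labels` helper is hypothetical), of filtering such tags by a confidence threshold:

```python
# Sample labels modeled on the Amazon tag list above, in the shape
# returned by AWS Rekognition DetectLabels ("Labels" entries).
sample_labels = [
    {"Name": "Person", "Confidence": 99.7},
    {"Name": "Chair", "Confidence": 87.1},
    {"Name": "Grass", "Confidence": 84.9},
    {"Name": "Tent", "Confidence": 71.8},
    {"Name": "Tree", "Confidence": 55.5},
]

def filter_labels(labels, min_confidence=70.0):
    """Keep labels at or above the threshold, highest confidence first."""
    kept = [label for label in labels if label["Confidence"] >= min_confidence]
    return sorted(kept, key=lambda label: label["Confidence"], reverse=True)

print([label["Name"] for label in filter_labels(sample_labels)])
# ['Person', 'Chair', 'Grass', 'Tent']
```

A threshold like this is one plausible way a catalog might decide which machine tags to surface; lower-confidence labels such as "Tree" (55.5) would be dropped.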

Imagga
created on 2022-01-15

umbrella 46.3
canvas tent 45.9
shelter 40.6
canopy 36.9
sky 24.2
tent 23.6
travel 19
mountain tent 18.5
protective covering 18.2
sunset 18
parasol 16.4
sun 16.1
summer 14.8
clouds 14.4
outdoors 14.2
landscape 14.1
man 14.1
silhouette 14.1
beach 13.9
grass 13.4
trees 13.3
adventure 13.3
environment 13.2
outdoor 13
people 12.8
protection 12.7
water 12.7
tourism 12.4
adult 11.6
person 11.6
structure 11.4
forest 11.3
leisure 10.8
scenic 10.5
sea 10.2
parachute 10.2
tree 10.1
danger 10
ocean 10
park 9.9
coast 9.9
camp 9.8
equipment 9.8
destruction 9.8
gas 9.6
black 9.6
sunrise 9.4
field 9.2
covering 9.2
dark 9.2
island 9.2
industrial 9.1
dirty 9
road 9
fun 9
accident 8.8
protective 8.8
sand 8.7
peaceful 8.2
freedom 8.2
vacation 8.2
morning 8.1
rural 7.9
stalker 7.9
radioactive 7.9
radiation 7.8
toxic 7.8
male 7.8
scene 7.8
nuclear 7.8
chemical 7.7
hiking 7.7
mask 7.7
evening 7.5
clothing 7.5
natural 7.4
light 7.4
suit 7.2
holiday 7.2
activity 7.2
night 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.5
person 88.1
black and white 78

Face analysis

Amazon

Google

AWS Rekognition

Age 24-34
Gender Female, 76%
Happy 96.2%
Sad 1%
Surprised 0.7%
Calm 0.6%
Angry 0.6%
Fear 0.6%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 30-40
Gender Female, 87.9%
Calm 40.1%
Fear 26.6%
Happy 21.3%
Disgusted 5.6%
Sad 2.7%
Surprised 1.6%
Angry 1.3%
Confused 0.8%

AWS Rekognition

Age 20-28
Gender Female, 82%
Calm 70.2%
Sad 13.8%
Happy 11%
Fear 1.4%
Angry 1.2%
Disgusted 1.1%
Surprised 0.8%
Confused 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
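Each face block above reports an age range, a gender estimate, and an emotion distribution for one detected face. A minimal sketch, assuming a Rekognition-style `DetectFaces` emotions list (sample values from the first AWS Rekognition block above; the `dominant_emotion` helper is hypothetical), of picking the dominant emotion:

```python
# Emotion scores modeled on the first AWS Rekognition face block above,
# in the shape of DetectFaces "Emotions" entries.
face_emotions = [
    {"Type": "HAPPY", "Confidence": 96.2},
    {"Type": "SAD", "Confidence": 1.0},
    {"Type": "SURPRISED", "Confidence": 0.7},
    {"Type": "CALM", "Confidence": 0.6},
    {"Type": "ANGRY", "Confidence": 0.6},
    {"Type": "FEAR", "Confidence": 0.6},
    {"Type": "DISGUSTED", "Confidence": 0.2},
    {"Type": "CONFUSED", "Confidence": 0.1},
]

def dominant_emotion(emotions):
    """Return (type, confidence) for the highest-confidence emotion."""
    top = max(emotions, key=lambda emotion: emotion["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face_emotions))
# ('HAPPY', 96.2)
```

Note that the second face block above (Calm 40.1%, Fear 26.6%, Happy 21.3%) has no clearly dominant emotion, which is why the full distribution, not just the top label, is worth recording.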

Feature analysis

Amazon

Person 99.7%
Chair 87.1%
Tent 71.8%

Captions

Microsoft

a group of people sitting in chairs 63.5%
a group of people sitting at a table with an umbrella 37.6%
a group of people sitting in a chair 37.5%

Text analysis

Amazon

43922

Google

43922