Human Generated Data

Title

Untitled (group of people eating at a picnic table, Rose Valley, PA)

Date

c. 1938

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12027

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 99.8
Person 99.8
Person 95.5
Person 95.3
Person 91.8
Person 90.5
Clothing 87.1
Apparel 87.1
Person 73
Coat 67.7
Overcoat 67.7
Suit 67.7
Ground 65.9
Leisure Activities 62.6
Musical Instrument 62.6
Piano 62.6
Meal 62.3
Food 62.3
Plant 60.6
Furniture 59.5
Child 59
Kid 59
Female 57.4
Grass 55.8
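Each machine-generated tag above pairs a label with a confidence score (0-100). A minimal sketch of how such scored labels might be thresholded to keep only high-confidence tags, using a few (label, score) pairs copied from the Amazon list above; the 90.0 cutoff and the helper name are illustrative assumptions, not part of the record:

```python
# (label, confidence) pairs taken from the Amazon tag list above.
labels = [
    ("Human", 99.8), ("Person", 99.8), ("Person", 95.5),
    ("Clothing", 87.1), ("Piano", 62.6), ("Grass", 55.8),
]

def confident_labels(tags, threshold=90.0):
    """Return distinct label names scoring at or above the threshold."""
    seen = []
    for name, score in tags:
        # Keep each qualifying label once, preserving first-seen order.
        if score >= threshold and name not in seen:
            seen.append(name)
    return seen

print(confident_labels(labels))  # ['Human', 'Person']
```

With the default cutoff, only "Human" and the repeated "Person" detections survive; lower-confidence guesses such as "Piano" (62.6) are dropped.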

Imagga
created on 2022-01-15

photographer 29.3
man 26.9
landscape 23.1
people 20.1
outdoors 19.4
stretcher 18.5
sunset 18
beach 17.9
sky 17.9
sport 16.9
gun 16.8
water 16.7
rifle 16.5
male 16.3
outdoor 16.1
litter 14.8
travel 14.8
person 14.6
ocean 14.3
sea 14.1
protection 13.6
musical instrument 13.6
silhouette 13.2
brass 13
trombone 12.9
lake 12.8
conveyance 12.6
mountain 12.5
adult 11.7
park 11.7
activity 11.6
leisure 11.6
military 11.6
adventure 11.4
shore 11.2
summer 10.9
danger 10.9
coast 10.8
destruction 10.8
disaster 10.7
vacation 10.6
dusk 10.5
sun 10.5
wind instrument 10.2
recreation 9.9
soldier 9.8
firearm 9.6
men 9.4
lifestyle 9.4
smoke 9.3
clouds 9.3
relax 9.3
weapon 9.1
river 8.9
scenic 8.8
couple 8.7
walking 8.5
boat 8.4
mountains 8.3
peaceful 8.2
percussion instrument 8.2
binoculars 8.1
mask 8.1
rock 7.8
backpack 7.8
terrain 7.8
nuclear 7.8
industry 7.7
two 7.6
desert 7.6
relaxation 7.5
dark 7.5
marimba 7.5
active 7.4
device 7.4
environment 7.4
sand 7.4
calm 7.3
industrial 7.3
horizon 7.2
romantic 7.1
equipment 7.1
day 7.1

Microsoft
created on 2022-01-15

black and white 96.4
person 95.7
text 93
outdoor 90.5
clothing 85.5
man 77.9
monochrome 72.6
funeral 64.1
grave 62.3

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 99%
Sad 65.5%
Calm 32.9%
Confused 0.5%
Happy 0.4%
Angry 0.3%
Disgusted 0.3%
Surprised 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Piano 62.6%

Captions

Microsoft

a group of people sitting on a bench 79.3%
a group of people that are sitting on a bench 70.3%
a group of people sitting at a bench 70.2%