Human Generated Data

Title

Untitled (man handing a woman a picnic basket from a boat)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10404

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Watercraft 99.5
Vessel 99.5
Vehicle 99.5
Transportation 99.5
Person 97.6
Human 97.6
Boat 89.2
Person 88.5
Dinghy 84.0
Rowboat 83.5
Shorts 57.6
Clothing 57.6
Apparel 57.6
Outdoors 56.6

Clarifai
created on 2023-10-26

people 99.3
two 98.0
man 97.8
water 97.1
recreation 96.9
adult 96.7
vehicle 94.6
monochrome 93.9
one 93.6
lake 92.1
rowboat 91.9
river 91.7
summer 91.3
leisure 91.0
watercraft 90.9
seat 90.9
tree 88.2
boat 87.6
woman 86.5
transportation system 86.4

Imagga
created on 2022-01-09

barrow 74.5
handcart 60.0
wheeled vehicle 47.4
vehicle 30.4
water 30
boat 25
sunset 20.7
lake 20.1
sea 19.5
outrigger 18.5
beach 18.1
paddle 18.1
ocean 17.6
man 17.5
fishing 17.3
people 17.3
silhouette 16.6
outdoors 16
oar 16.0
conveyance 15.9
sport 15.8
river 15.1
male 14.9
stabilizer 14.8
summer 14.8
device 14.4
fisherman 14.2
recreation 13.4
activity 13.4
sky 12.8
landscape 12.6
shore 12.5
leisure 12.5
tree 12.3
travel 12
outdoor 11.5
active 10.8
reflection 10.6
person 10.4
sun 9.7
couple 9.6
light 9.4
holiday 9.3
relax 9.3
peaceful 9.2
vacation 9
calm 8.2
park 8.2
exercise 8.2
sunlight 8
scene 7.8
wave 7.8
adult 7.8
old 7.7
dusk 7.6
walking 7.6
sunrise 7.5
fish 7.3
lawn mower 7.3
romantic 7.1
women 7.1

Microsoft
created on 2022-01-09

text 98.9
tree 98
water 95.5
outdoor 95
black and white 88.2
lake 86.1
boat 68.0
monochrome 61.5
person 60.8
watercraft 54.2
old 42.3

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 88.9%
Sad 21.9%
Angry 19.3%
Happy 18.8%
Fear 15.4%
Surprised 14.6%
Calm 6.7%
Disgusted 2.2%
Confused 1.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person 97.6%
Boat 89.2%

Text analysis

Amazon

42938

Google

42938 |MJIन -- YT3৭A°2--
42938
|MJIन
--
YT3৭A°2--