Human Generated Data

Title

Untitled (girl throwing life preserver off dock)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8355

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.7
Human 98.7
Outdoors 98.3
Nature 98
Shelter 97.6
Building 97.6
Countryside 97.6
Rural 97.6
Housing 97
House 88.1
Clothing 77.8
Apparel 77.8
Portrait 64.3
Face 64.3
Photography 64.3
Photo 64.3
Hut 60
Female 56.8
Cabin 55.4
Dress 55.1

Clarifai
created on 2023-10-25

people 99.9
child 98.7
two 98.5
man 97.8
adult 97.8
vehicle 97.5
monochrome 97.2
group 96.8
group together 96.6
water 95.7
boy 95.6
recreation 95.3
one 94.6
home 94.3
watercraft 94.1
three 91.7
fisherman 89.3
house 89
family 88.5
bucket 85

Imagga
created on 2022-01-09

shovel 53.8
hand tool 30.4
tool 30
cleaning implement 27.4
man 23.5
cleaner 22.5
swab 20.8
male 17
destruction 16.6
person 16.1
protection 15.5
water 15.4
nuclear 13.6
fisherman 13.4
people 13.4
silhouette 13.3
danger 12.7
fire iron 12.7
protective 12.7
landscape 12.6
travel 12
outdoors 12
industrial 11.8
radioactive 11.8
dirty 11.8
radiation 11.7
disaster 11.7
sunset 11.7
adult 11.6
sport 11.6
mask 11.6
chemical 11.6
gas 11.6
sky 11.5
outdoor 11.5
stalker 10.9
dark 10.9
accident 10.7
toxic 10.7
environment 10.7
industry 10.3
safety 10.1
clothing 9.8
building 9.7
black 9.6
broom 9.5
men 9.5
smoke 9.3
beach 9.3
old 9.1
summer 9
activity 9
river 8.9
military 8.7
cloud 8.6
tree 8.5
leisure 8.3
horizon 8.1
sea 7.8
steam 7.8
cold 7.8
fishing 7.7
winter 7.7
sunrise 7.5
ocean 7.5
park 7.4
alone 7.3
sun 7.2
suit 7.2
transportation 7.2
sunlight 7.1
equipment 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

outdoor 99.4
text 98.2
water 82.9
black and white 68.6
house 51.3

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 49-57
Gender Female, 76.2%
Angry 46.2%
Calm 17.5%
Sad 12.6%
Confused 10%
Fear 6.9%
Happy 3.1%
Surprised 2.7%
Disgusted 1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%

Text analysis

Amazon

KODVK

Google

><>
><>