Human Generated Data

Title

Untitled (women in woods, search for missing child)

Date

1960

People

Artist: Francis J. Sullivan, American 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18747

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Vegetation 99.9
Plant 99.9
Woodland 99.9
Forest 99.9
Land 99.9
Outdoors 99.9
Tree 99.9
Nature 99.9
Person 99.7
Human 99.7
Grove 99.5
Clothing 99.2
Apparel 99.2
Person 98.6
Person 97.4
Grass 97.2
Dress 96.9
Shoe 86.2
Footwear 86.2
Yard 83.8
Female 83.3
People 69.8
Meal 67.7
Food 67.7
Park 66.8
Lawn 66.8
Woman 66
Jungle 65.9
Kid 64.8
Child 64.8
Shoe 64
Hat 62.8
Girl 62.2
Photography 60.2
Photo 60.2
Face 59.6
Play 58.9
Costume 58.2
Pants 58
Shorts 57.8
Path 56.7
Field 56
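
The numbers beside each tag above are confidence scores on a 0 to 100 scale returned by the labeling service. As a minimal sketch of how such label/score pairs can be produced with Amazon Rekognition's DetectLabels API (via boto3), the snippet below reads an image file and prints its labels; the file name and the MinConfidence threshold are illustrative assumptions, not values taken from this record.

import boto3

def detect_labels(image_path, min_confidence=55.0):
    # Call Rekognition DetectLabels on raw image bytes and return (name, score) pairs
    # shaped like the "Tree 99.9" lines in the listing above.
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    return [(label["Name"], round(label["Confidence"], 1)) for label in response["Labels"]]

if __name__ == "__main__":
    # "photograph.jpg" is a placeholder path for illustration only.
    for name, score in detect_labels("photograph.jpg"):
        print(name, score)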

Clarifai
created on 2023-10-22

people 99.8
child 99.2
monochrome 97.6
group 96.3
adult 95.4
group together 94.4
boy 93.7
man 93.6
wear 90.8
family 89.6
woman 89.6
son 89
black and white 88.6
interaction 88.5
war 84.4
girl 83.8
recreation 81.8
two 80.1
administration 79.2
home 78.8

Imagga
created on 2022-03-05

swing 66.3
mechanical device 53.8
plaything 52.8
mechanism 40.1
newspaper 33.7
product 26
pedestrian 21.8
man 21.5
creation 20.1
water 19.3
sunset 18
sun 17.7
light 15.4
dark 14.2
ocean 14.1
person 14
danger 13.6
fog 13.5
sky 13.4
male 12.8
people 12.8
destruction 12.7
beach 12.6
silhouette 12.4
forest 12.2
protection 11.8
gas 11.6
tree 11.5
park 11.5
sea 10.9
summer 10.9
industrial 10.9
stalker 10.9
dirty 10.8
radioactive 10.8
radiation 10.8
accident 10.7
toxic 10.7
wet 10.7
protective 10.7
nuclear 10.7
chemical 10.6
mask 10.5
landscape 10.4
peaceful 10.1
environment 9.9
sunlight 9.8
outdoors 9.7
autumn 9.7
couple 9.6
dusk 9.5
winter 9.4
smoke 9.3
old 9.1
morning 9
black 9
trees 8.9
misty 8.8
sand 8.7
dawn 8.7
mist 8.7
cold 8.6
clothing 8.5
travel 8.4
portrait 8.4
outdoor 8.4
adult 8.4
safety 8.3
scenery 8.1
fantasy 8.1
suit 8.1
camouflage 7.8
darkness 7.8
soldier 7.8
disaster 7.8
season 7.8
scene 7.8
military 7.7
protect 7.7
industry 7.7
woods 7.6
relax 7.6
path 7.6
groom 7.5
sunrise 7.5
fun 7.5
boat 7.4
road 7.2
recreation 7.2
cool 7.1
happiness 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

outdoor 99.2
tree 98
text 93.8
black and white 91.2
clothing 90.1
person 87.2
footwear 81.5
black 67.6
monochrome 58
old 47.6

Color Analysis

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 59.1%
Calm 50.6%
Sad 32%
Confused 5.1%
Disgusted 4.8%
Fear 2.4%
Angry 2.3%
Happy 1.8%
Surprised 1.1%
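
As a minimal sketch of how an age range, gender estimate, and ranked emotion scores like those above can be obtained from Amazon Rekognition's DetectFaces API (via boto3), the snippet below requests all facial attributes for an image; the image path is an illustrative assumption, not part of this record.

import boto3

def analyze_faces(image_path):
    # Request full facial attributes and summarize each detected face in the
    # same shape as the listing above: age range, gender, and sorted emotions.
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
    results = []
    for face in response["FaceDetails"]:
        results.append({
            "age": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": (face["Gender"]["Value"], round(face["Gender"]["Confidence"], 1)),
            # Emotions come back as a list of {"Type", "Confidence"}; sort by
            # confidence so the output reads like "Calm 50.6% ... Surprised 1.1%".
            "emotions": sorted(
                ((e["Type"], round(e["Confidence"], 1)) for e in face["Emotions"]),
                key=lambda pair: pair[1],
                reverse=True,
            ),
        })
    return results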

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Person 98.6%
Person 97.4%
Shoe 86.2%
Shoe 64%