Human Generated Data

Title

A destitute family, Ozark Mountains area

Date

1935

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3074

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 99.2
Person 99.2
Person 98.9
Person 95
People 75.8
Theme Park 75.6
Amusement Park 75.6
Outdoors 74.6
Nature 66.2
Face 62.7
Jaw 58.5
Vehicle 58.5
Transportation 58.5
Clothing 56.6
Apparel 56.6
Photo 56.2
Portrait 56.2
Photography 56.2
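
The Amazon labels above pair each detected concept with a confidence score in percent. A minimal sketch of how such labels can be produced with Amazon Rekognition's DetectLabels via boto3 follows; the file name, region, and thresholds are placeholders, not values taken from this record.

```python
# Sketch: label detection with Amazon Rekognition (boto3).
# File name and region are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("ozark_family.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=55,  # roughly the floor of the scores listed above
)

for label in response["Labels"]:
    # Each label carries a name and a confidence score in percent.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```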

Imagga
created on 2021-12-15

man 22.2
people 21.7
statue 21.3
family 17.8
happy 17.5
old 17.4
person 16.5
grandfather 16
sculpture 15.8
kin 15.6
male 15.6
love 14.2
mother 13.9
adult 13.7
smile 13.5
religion 13.4
portrait 12.9
outdoors 12.7
child 12.5
sitting 12
history 11.6
park 11.6
elderly 11.5
outdoor 11.5
together 11.4
art 11.1
summer 10.9
seat 10.9
smiling 10.8
sky 10.8
clothing 10.7
couple 10.4
senior 10.3
monument 10.3
bench 9.9
fun 9.7
parent 9.7
car 9.5
antique 9.5
military uniform 9.5
outside 9.4
happiness 9.4
culture 9.4
lifestyle 9.4
architecture 9.4
face 9.2
vehicle 9
kid 8.9
boy 8.7
sibling 8.6
sit 8.5
father 8.5
travel 8.4
hand 8.3
joy 8.3
uniform 8.3
tourism 8.2
looking 8
ancient 7.8
retired 7.7
men 7.7
attractive 7.7
son 7.6
stone 7.6
casual 7.6
park bench 7.6
relax 7.6
historic 7.3
building 7.3
aged 7.2
home 7.2
childhood 7.2
hair 7.1
grass 7.1
day 7.1
autumn 7
modern 7
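
The Imagga tags above come from an automatic tagging service that returns English keywords with 0-100 confidence scores. Below is a hedged sketch of a request against Imagga's v2 /tags endpoint using the requests library; the API key, secret, and image URL are placeholders, and the response layout reflects the documented endpoint as I understand it.

```python
# Sketch: auto-tagging with the Imagga v2 /tags endpoint.
# Credentials and image URL are placeholders.
import requests

API_KEY = "your_imagga_api_key"        # placeholder
API_SECRET = "your_imagga_api_secret"  # placeholder
IMAGE_URL = "https://example.org/ozark_family.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    # Each entry pairs an English tag with a confidence score (0-100).
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```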

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

outdoor 99.9
person 99.7
text 98.5
clothing 97.3
human face 86
man 82.5
smile 71.1
old 46.5
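
The Microsoft tags above can be reproduced with the Azure Computer Vision "tag" operation. The sketch below targets the v3.2 REST endpoint with the requests library; the endpoint, key, and image URL are placeholders, and the service reports confidence on a 0-1 scale rather than the percentages shown in this record.

```python
# Sketch: image tagging with the Azure Computer Vision REST API (v3.2).
# Endpoint, key, and image URL are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_azure_key"  # placeholder
IMAGE_URL = "https://example.org/ozark_family.jpg"  # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Confidence is 0-1; multiply by 100 to compare with the list above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")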

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 51.6%
Sad 72.6%
Calm 13.4%
Angry 4.4%
Fear 4%
Confused 3.9%
Surprised 0.7%
Happy 0.6%
Disgusted 0.4%

AWS Rekognition

Age 23-35
Gender Female, 84.2%
Calm 97.3%
Angry 0.9%
Happy 0.8%
Surprised 0.3%
Sad 0.2%
Confused 0.2%
Disgusted 0.2%
Fear 0%

AWS Rekognition

Age 27-43
Gender Male, 96.9%
Calm 99.2%
Sad 0.4%
Angry 0.1%
Fear 0.1%
Confused 0.1%
Surprised 0.1%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 0-3
Gender Male, 92.7%
Calm 91.2%
Sad 2.6%
Confused 2.1%
Angry 1.9%
Surprised 1.2%
Fear 0.4%
Happy 0.4%
Disgusted 0.3%
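
The four AWS Rekognition blocks above report, per detected face, an estimated age range, a gender guess with confidence, and a ranked set of emotion scores. A minimal sketch of the corresponding DetectFaces call with all attributes requested follows; the file name and region are placeholders.

```python
# Sketch: per-face attributes with Amazon Rekognition DetectFaces (boto3).
# File name and region are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("ozark_family.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```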

Microsoft Cognitive Services

Age 49
Gender Male

Microsoft Cognitive Services

Age 45
Gender Male

Microsoft Cognitive Services

Age 6
Gender Male
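
The Microsoft Cognitive Services blocks above give a single estimated age and gender per face. At the time this data was created (2021) the Face API could return these via the returnFaceAttributes parameter; Microsoft has since restricted age and gender estimation, so the sketch below is illustrative of the older REST call only. Endpoint, key, and image URL are placeholders.

```python
# Hedged sketch: 2021-era Face API detect call returning age and gender.
# These attributes have since been restricted; placeholders throughout.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_face_api_key"  # placeholder
IMAGE_URL = "https://example.org/ozark_family.jpg"  # placeholder

response = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

for face in response.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")
```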

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
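
The Google Vision blocks above report likelihood categories (Very unlikely, Possible, Very likely, and so on) rather than numeric scores. A minimal sketch using the google-cloud-vision client library follows, assuming application default credentials are configured; the file path is a placeholder.

```python
# Sketch: face detection with google-cloud-vision, printing likelihood enums.
# File path is a placeholder; credentials via application default credentials.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("ozark_family.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enums such as VERY_UNLIKELY, POSSIBLE, VERY_LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```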

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a man sitting on a bench 76.2%
an old photo of a man 76.1%
a man sitting on a boat 61.7%
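
The ranked captions above resemble the output of the Azure Computer Vision "describe" operation, which returns several candidate captions with confidence scores. Below is a hedged sketch against the v3.2 REST endpoint; the endpoint, key, and image URL are placeholders, and the service reports confidence on a 0-1 scale.

```python
# Sketch: ranked image captions with the Azure Computer Vision
# "describe" operation (v3.2 REST). Placeholders throughout.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_azure_key"  # placeholder
IMAGE_URL = "https://example.org/ozark_family.jpg"  # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    # Confidence is 0-1; multiply by 100 to compare with the list above.
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```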