Human Generated Data

Title

The Thaxton family, near Mechanicsburg, Ohio

Date

1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3023

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Person 99.8
Human 99.8
Person 99.8
Person 99.6
Person 99.5
Apparel 99
Clothing 99
Person 98.9
Person 96.2
People 89.2
Shoe 87.9
Footwear 87.9
Shoe 84.1
Shoe 83.8
Shoe 77.6
Shoe 74.7
Shoe 67.8
Housing 64.4
Building 64.4
Family 64
Outdoors 63.5
Robe 63.3
Gown 63.3
Evening Dress 63.3
Fashion 63.3
Sleeve 60
Dress 56.5
Female 56

Clarifai
created on 2018-09-18

people 99.9
group 99.3
child 99.2
many 96.9
group together 96.7
wear 95.3
adult 95
home 94.6
offspring 93.7
administration 92.8
several 92.7
family 91
sibling 89.6
woman 87.8
facial expression 86.7
five 84.6
leader 84.4
man 84
boy 83.3
portrait 83

Imagga
created on 2018-09-18

kin 61.8
statue 20.7
people 20.1
man 16.2
world 14.6
male 14.4
sculpture 13.5
ancient 13
portrait 12.9
old 12.5
adult 12.4
child 12.4
travel 12
culture 12
person 11.8
history 11.6
clothing 11.6
face 11.4
mother 11
traditional 10.8
soldier 10.7
art 10.6
uniform 10.3
dress 9.9
women 9.5
historical 9.4
architecture 9.4
monument 9.3
city 9.1
girls 9.1
tourism 9.1
parent 8.9
antique 8.7
stone 8.5
religion 8.1
army 7.8
marble 7.7
military 7.7
fashion 7.5
park 7.4
historic 7.3
lady 7.3
smile 7.1

Google
created on 2018-09-18

Microsoft
created on 2018-09-18

building 99.9
outdoor 98.9
person 98.6
standing 87.9
posing 82.6
black 67.8
dressed 27.8
clothes 21.4

Color Analysis

Face analysis

AWS Rekognition

Age 10-15
Gender Female, 99.3%
Disgusted 5.3%
Angry 44.8%
Calm 3.4%
Sad 32.6%
Happy 9.3%
Confused 1.2%
Surprised 3.5%

AWS Rekognition

Age 10-15
Gender Female, 99.7%
Confused 7.8%
Calm 32.1%
Angry 32.1%
Sad 23.6%
Surprised 1%
Disgusted 1.9%
Happy 1.4%

AWS Rekognition

Age 10-15
Gender Female, 98.1%
Calm 8.7%
Confused 4.3%
Surprised 1.5%
Disgusted 3%
Happy 1.9%
Angry 8.3%
Sad 72.5%

AWS Rekognition

Age 20-38
Gender Male, 91.7%
Sad 86.8%
Angry 3.2%
Disgusted 0.9%
Surprised 0.4%
Calm 7.6%
Happy 0.3%
Confused 0.8%

AWS Rekognition

Age 48-68
Gender Female, 94.8%
Angry 12.1%
Disgusted 5.8%
Happy 6.8%
Sad 30.2%
Calm 38.3%
Confused 2.8%
Surprised 4%

AWS Rekognition

Age 49-69
Gender Male, 94.1%
Happy 0.6%
Surprised 1%
Disgusted 3.5%
Confused 2.3%
Angry 14.8%
Calm 61.1%
Sad 16.8%

Microsoft Cognitive Services

Age 15
Gender Male

Microsoft Cognitive Services

Age 50
Gender Male

Microsoft Cognitive Services

Age 25
Gender Female

Microsoft Cognitive Services

Age 48
Gender Female

Microsoft Cognitive Services

Age 30
Gender Female

Microsoft Cognitive Services

Age 10
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Shoe 87.9%