Human Generated Data

Title

Untitled (rehabilitation client, Maria Plantation, Ozarks, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1592

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Face 100
Head 100
Photography 100
Portrait 100
Wood 99.9
Clothing 99.2
Dress 99.2
Person 99.1
Child 99.1
Female 99.1
Girl 99.1
Person 98.6
Child 98.6
Boy 98.6
Male 98.6
Person 98.6
Baby 98.6
Person 98.2
Male 98.2
Adult 98.2
Man 98.2
Pants 97.7
Shorts 91.5
People 91.3
Coat 85
Outdoors 76.2
Barefoot 57.7
Door 56.6
Animal 56.3
Mammal 56.3
Skirt 56.2
Nature 56.2
Pet 56.2
Hat 55.8
Body Part 55.2
Finger 55.2
Hand 55.2
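
The Amazon tags above, each paired with a confidence score, are the kind of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how such label/confidence pairs could be retrieved with boto3, assuming configured AWS credentials and a hypothetical local copy of the image named rehabilitation_client.jpg:

import boto3

# Hypothetical local filename; the museum image itself is not bundled here.
IMAGE_PATH = "rehabilitation_client.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores (0-100),
# comparable to the tag/confidence pairs listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")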

Clarifai
created on 2018-05-11

people 100
child 99.6
group 99.5
two 98.5
three 98.2
offspring 98.2
adult 97.7
four 97.6
wear 97.1
sibling 96.2
woman 95.3
several 95.1
group together 94
five 93.1
administration 92.3
actress 89.7
son 89.7
family 89.3
man 89
outfit 88.4

Imagga
created on 2023-10-05

kin 20.1
portrait 20.1
statue 19.6
child 17.9
male 17.3
people 17.3
man 16.8
couple 16.6
happy 16.3
person 15.9
sculpture 15.5
old 15.3
love 14.2
world 13.6
marble 13.6
happiness 13.3
adult 13.1
smiling 13
culture 12.8
art 12.5
architecture 12.5
mother 12.3
monument 12.1
ancient 12.1
dress 11.7
religion 11.6
history 11.6
bride 11.5
women 11.1
traditional 10.8
face 10.7
outdoors 10.4
wedding 10.1
historic 10.1
house 10
smile 10
sibling 9.9
fashion 9.8
family 9.8
groom 9.6
home 9.6
two 9.3
stone 9.3
lady 8.9
together 8.8
decoration 8.7
travel 8.4
pretty 8.4
human 8.2
tourism 8.2
sexy 8
romantic 8
look 7.9
standing 7.8
parent 7.7
bouquet 7.5
historical 7.5
religious 7.5
vintage 7.4
famous 7.4
style 7.4
girls 7.3
daughter 7.3
lifestyle 7.2
landmark 7.2
black 7.2
holiday 7.2
column 7.1
interior 7.1
day 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.9
outdoor 98.8
standing 83.9
black 70.5
white 70
posing 69

Face analysis

AWS Rekognition

Age 54-64
Gender Male, 92%
Calm 99.5%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.1%
Angry 0.1%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 1-7
Gender Male, 55.4%
Sad 100%
Surprised 6.3%
Fear 5.9%
Angry 4.8%
Calm 0.6%
Confused 0.3%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 6-14
Gender Female, 71.7%
Sad 98.4%
Calm 23.5%
Angry 9%
Surprised 9%
Fear 7%
Disgusted 2.7%
Confused 1.3%
Happy 0.6%

AWS Rekognition

Age 35-43
Gender Female, 100%
Calm 51.2%
Fear 23%
Surprised 19%
Angry 3.3%
Sad 2.9%
Happy 2.4%
Disgusted 1.7%
Confused 1.4%

AWS Rekognition

Age 0-4
Gender Male, 68.3%
Calm 87.9%
Surprised 7.5%
Fear 6%
Sad 3.7%
Angry 3.4%
Confused 1%
Disgusted 0.8%
Happy 0.3%
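
The age ranges, gender estimates, and emotion scores in the AWS Rekognition entries above correspond to the DetectFaces operation with full attributes enabled. A minimal sketch with boto3, again assuming configured credentials and the hypothetical rehabilitation_client.jpg file:

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local filename; substitute the actual image file.
with open("rehabilitation_client.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] asks Rekognition to estimate an age range, gender,
# and emotion confidences for each detected face, matching the per-face
# blocks listed above.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}, Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    for emotion in face["Emotions"]:
        print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")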

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 40
Gender Male

Microsoft Cognitive Services

Age 5
Gender Female

Microsoft Cognitive Services

Age 5
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
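
The Google Vision entries above report per-face likelihood buckets rather than numeric scores. A minimal sketch of how similar output could be obtained, assuming the google-cloud-vision client library (2.x) and the same hypothetical local file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local filename; substitute the actual image file.
with open("rehabilitation_client.jpg", "rb") as f:
    content = f.read()

# face_detection returns one FaceAnnotation per face, with likelihood
# buckets (VERY_UNLIKELY .. VERY_LIKELY) for joy, sorrow, anger,
# surprise, headwear, and blur, as in the Google Vision entries above.
response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)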

Feature analysis

Amazon

Person 99.1%
Child 99.1%
Female 99.1%
Girl 99.1%
Boy 98.6%
Male 98.6%
Baby 98.6%
Adult 98.2%
Man 98.2%
Coat 85%

Categories

Imagga

paintings art 93.6%
people portraits 6.3%

Text analysis

Amazon

27

Google

7
7