Human Generated Data

Title

Untitled (Mulhall family, Ozarks, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3591

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Face 100
Head 100
Photography 100
Portrait 100
People 99.8
Person 99.4
Adult 99.4
Female 99.4
Woman 99.4
Person 99
Baby 99
Person 98.2
Baby 98.2
Clothing 89.4
Dress 89.4
Wood 77.4
Happy 57.4
Smile 57.4
Lady 57.2
Window 57
Blouse 56.3
Accessories 56
Earring 56
Jewelry 56
Newborn 55.6
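
The Amazon tags above come from AWS Rekognition's label-detection API. A minimal sketch of the call via boto3, assuming a placeholder S3 bucket and key (this record does not include the actual image location):

```python
# Hedged sketch: reproduce label/confidence pairs like the list above.
import boto3

client = boto3.client("rekognition")

response = client.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn-mulhall.jpg"}},
    MinConfidence=55,  # the listing above cuts off around 55%
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
    # Repeated names (e.g. Person, Baby) reflect multiple detected
    # instances, each with its own bounding box:
    for instance in label.get("Instances", []):
        print("  box:", instance["BoundingBox"])
```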

Clarifai
created on 2018-05-10

people 100
child 99.8
two 99
offspring 98.6
adult 98.4
group 97.8
baby 97.7
portrait 97.6
actress 97.5
wear 97.4
son 97.2
sibling 95.6
three 94.7
family 91.1
administration 90.9
facial expression 89.8
music 89.8
woman 89.7
interaction 87.5
profile 86.2
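
A hedged sketch of a Clarifai v2 predict call that yields concept scores like the list above. The model ID, API key, and image URL are placeholders, and Clarifai's API has changed since 2018, when these tags were generated:

```python
import requests

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
)
# Concept values are 0-1 floats; the listing above shows them as percentages.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```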

Imagga
created on 2023-10-06

sculpture 27.8
world 27.6
statue 26.8
parent 25.5
mother 22.2
kin 20.5
ancient 19
old 18.8
dad 17.5
man 17.5
people 17.3
person 17
portrait 16.8
art 15.7
marble 15.5
male 15.3
body 15.2
human 15
architecture 14.8
stone 14.3
face 14.2
love 14.2
father 13.8
adult 13.6
religion 13.4
history 12.5
child 11.7
vintage 11.6
sexy 11.2
religious 11.2
culture 11.1
model 10.9
black 10.8
fan 10.6
bride 10.6
one 10.5
couple 10.5
antique 10.4
building 10.4
monument 10.3
sensuality 10
attractive 9.8
lady 9.7
hand 9.1
fashion 9
dress 9
sepia 8.7
lifestyle 8.7
follower 8.6
travel 8.4
wedding 8.3
figure 8.2
decoration 8
death 7.7
wall 7.7
historical 7.5
city 7.5
historic 7.3
girls 7.3
fitness 7.2
cemetery 7.2
posing 7.1
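
A sketch of Imagga's v2 tagging endpoint, which returns confidence-scored tags like the list above. The credentials and image URL are placeholders:

```python
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # HTTP basic auth
)
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], item["confidence"])
```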

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.3
outdoor 98.9
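
The Microsoft tags can be produced with the Azure Computer Vision "analyze" REST call. A hedged sketch, with a placeholder endpoint and key; v3.2 is one published API version, not necessarily the one used for this 2018 run:

```python
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/image.jpg"},
)
# Confidences are 0-1 floats; shown above as percentages.
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))
```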

Color Analysis

Face analysis

AWS Rekognition

Age 45-51
Gender Female, 99.5%
Sad 95.8%
Calm 34.3%
Fear 7.2%
Surprised 6.7%
Confused 5.1%
Disgusted 3.3%
Angry 2.3%
Happy 2.2%

AWS Rekognition

Age 4-10
Gender Female, 56.4%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 0.5%
Confused 0.2%
Angry 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 0-4
Gender Female, 99.6%
Calm 77.7%
Surprised 9.8%
Fear 7.4%
Sad 4.9%
Confused 4.4%
Happy 1%
Angry 0.8%
Disgusted 0.7%

AWS Rekognition

Age 6-14
Gender Female, 99.8%
Sad 100%
Fear 11.1%
Surprised 6.7%
Calm 2.5%
Confused 0.7%
Disgusted 0.3%
Angry 0.2%
Happy 0.1%
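
The four blocks above come from AWS Rekognition face analysis: each detected face carries an age range, a gender guess with confidence, and per-emotion scores. A minimal sketch, again with a placeholder image location:

```python
import boto3

client = boto3.client("rekognition")
response = client.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn-mulhall.jpg"}},
    Attributes=["ALL"],  # the default returns a smaller attribute set
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```

Note that the emotion confidences are scored independently rather than normalized, which is why a face can read "Sad 100%" while other emotions still show nonzero scores.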

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 6
Gender Male

Microsoft Cognitive Services

Age 52
Gender Male

Microsoft Cognitive Services

Age 5
Gender Female
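
The flat age/gender estimates above match the legacy Azure Face API detect call. A hedged sketch of that call as it existed around 2018; the endpoint and key are placeholders, and Microsoft has since restricted access to these attributes:

```python
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/image.jpg"},
)
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}, Gender {attrs['gender'].title()}")
```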

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
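
Google Cloud Vision face detection reports bucketed likelihoods (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which is what the three blocks above show. A sketch using the Python client, with a placeholder file path:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("shahn-mulhall.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```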

Feature analysis

Amazon

Person 99.4%
Adult 99.4%
Female 99.4%
Woman 99.4%
Baby 99%

Categories

Imagga

people portraits 92.2%
paintings art 7.3%
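
A hedged sketch of Imagga's categorization endpoint, which returns scene-level categories like the two above. "personal_photos" is an assumed categorizer ID (one Imagga has published); credentials and the image URL are placeholders:

```python
import requests

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.org/image.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
)
for cat in resp.json()["result"]["categories"]:
    print(cat["name"]["en"], cat["confidence"])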

Text analysis

Amazon

133%F.
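
A sketch of AWS Rekognition text detection, the likely source of the "133%F." reading above. Image location is a placeholder:

```python
import boto3

client = boto3.client("rekognition")
response = client.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn-mulhall.jpg"}}
)
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip per-word duplicates
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
```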