Human Generated Data

Title

Untitled (Mulhall family, sharecroppers, Ozarks, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1137

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Art 100
Painting 100
Face 99.9
Head 99.9
Photography 99.9
Portrait 99.9
Person 99.2
Child 99.2
Female 99.2
Girl 99.2
Person 98.9
Adult 98.9
Male 98.9
Man 98.9
Person 98.8
Female 98.8
Adult 98.8
Woman 98.8
Person 95.4
Baby 95.4
Person 91.8
Female 91.8
Adult 91.8
Woman 91.8
Lady 57.8
Clothing 57.2
Dress 57.2
People 55.9
Drawing 55.1

Clarifai
created on 2018-05-11

people 100
adult 99.5
group 99.3
woman 98.3
man 98.1
wear 97.5
two 97.5
child 97.5
music 97.5
portrait 96.7
dancing 95.4
musician 93.8
actress 93
three 92.8
art 92.7
print 91.9
veil 90.4
dancer 90
outfit 89.9
family 88.7

Imagga
created on 2023-10-06

world 30.3
kin 21.7
person 21.1
people 19
adult 17.7
black 17
one 15.7
portrait 14.9
dress 14.5
man 13.4
old 13.2
vintage 13.2
art 13.2
newspaper 13.1
style 12.6
posing 12.4
fashion 12.1
sensuality 11.8
mother 11.7
dark 11.7
symbol 11.4
clothing 11
product 10.7
sexy 10.4
love 10.3
model 10.1
water 10
male 10
face 9.9
dance 9.8
human 9.7
lady 9.7
creation 9.5
expression 9.4
elegance 9.2
attractive 9.1
body 8.8
happy 8.8
lifestyle 8.7
bride 8.6
hair 7.9
jacket 7.9
postmark 7.9
stamp 7.7
pretty 7.7
mail 7.7
book jacket 7.7
energy 7.6
retro 7.4
light 7.4
performer 7.3
office 7.2
fitness 7.2
religion 7.2
parent 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

text 99.9
book 94.7
posing 79.8
old 72.5
image 31.1
vintage 28.4

Color Analysis

Face analysis

AWS Rekognition

Age 6-16
Gender Female, 99.2%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 1.7%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Happy 0%

AWS Rekognition

Age 2-10
Gender Female, 99.9%
Sad 98.7%
Angry 31.5%
Calm 8.5%
Surprised 6.4%
Fear 6.2%
Disgusted 0.9%
Happy 0.7%
Confused 0.3%

AWS Rekognition

Age 0-3
Gender Female, 61.7%
Calm 91.7%
Surprised 6.4%
Fear 6.3%
Sad 4.9%
Confused 0.3%
Angry 0.3%
Disgusted 0.2%
Happy 0.1%

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 35
Gender Female

Microsoft Cognitive Services

Age 4
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Child 99.2%
Female 99.2%
Girl 99.2%
Adult 98.9%
Male 98.9%
Man 98.9%
Woman 98.8%
Baby 95.4%
