Human Generated Data

Title

Untitled (Susanna Shahn)

Date

Spring 1937

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4379.5

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Crib 99.9
Furniture 99.9
Infant Bed 99.9
Baby 99.4
Person 99.4
Face 98
Head 98
Plant 91.1
Bed 67.6
Handrail 55.7
Cradle 55.5

Clarifai
created on 2018-05-09

people 99.9
child 99.8
baby 98.9
one 97.8
portrait 96.9
administration 92.9
innocence 92.5
adult 91
monochrome 90.2
infancy 89.7
offspring 89.4
indoors 88.3
family 88.2
sit 87.8
wear 86.2
son 86.1
two 85.3
furniture 82.6
newborn 78
facial expression 77.8

Imagga
created on 2023-10-07

baby bed 100
cradle 98.6
furniture 75
child 71.9
furnishing 51.1
kid 47
baby 44.2
childhood 40.3
little 39.8
cute 36.6
boy 31.3
mother 31.2
family 30.3
happy 28.8
children 27.3
parent 26.5
happiness 25.9
toddler 25.8
fun 25.5
infant 25.1
face 24.9
daughter 24
bassinet 23.8
portrait 23.3
play 22.4
smile 21.4
love 21.3
newborn 20.4
adorable 20.3
joy 20.1
cheerful 19.5
innocent 19.4
innocence 19.2
playing 19.2
girls 19.1
eyes 18.9
care 18.9
people 18.4
smiling 18.1
vessel 17
kids 17
person 16.7
youth 16.2
sweet 15
son 14.5
one 14.2
expression 13.7
looking 12.8
life 12.5
healthy 12
lifestyle 11.6
together 11.4
human 11.3
male 11.1
joyful 11
father 11
tub 10.8
bucket 10.7
home 10.4
sitting 10.3
hair 10.3
toy 10.1
blond 9.9
hand 9.9
health 9.7
container 9.2
old 9.1
parenthood 8.8
man 8.7
boys 8.7
bed 8.5
pretty 8.4
funny 8.3
outdoors 8.2
sister 7.8
attractive 7.7
enjoying 7.6
playful 7.6
enjoyment 7.5
leisure 7.5
park 7.4

Google
created on 2018-05-09

Microsoft
created on 2018-05-09

baby 66.5

Face analysis

AWS Rekognition

Age 0-3
Gender Female, 50.4%
Happy 62.1%
Calm 8.6%
Angry 7.7%
Surprised 7.3%
Fear 7%
Confused 5.8%
Disgusted 5.8%
Sad 4.4%

Feature analysis

Amazon

Baby 99.4%
Person 99.4%
Plant 91.1%

Captions

Microsoft
created on 2018-05-09

a baby sitting in a bowl 72.9%
a baby sitting on a table 64.5%
a person holding a baby 64.4%

Text analysis

Amazon

President and Fellows of Harvard College (Harvard University Art Museums)
P1970.4379.0005 inverted fm neg

Google

O President and Fellows of Harvard College (Harvard University Art Museums) P1970.4379.0005 inverted fm neg