Human Generated Data

Title

Untitled (two boys and little girl sitting in living room, next to television)

Date

c.1970, from 1956 negative

People

Artist: Francis J. Sullivan, American, 1916-1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18662

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Monitor 99.8
Display 99.8
Screen 99.8
Electronics 99.8
Human 98.7
Person 98.7
Footwear 98.2
Shoe 98.2
Clothing 98.2
Apparel 98.2
Person 97.6
Helmet 95.8
Television 94.8
TV 94.8
Person 94.1
Face 77.5
Furniture 70
Kid 67
Child 67
People 65.3
Female 64.8
Photo 63.9
Portrait 63.9
Photography 63.9
Indoors 63.4
Girl 61.6
Baby 59.8
Room 59.4
Chair 57.9
Flooring 55.8
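The label list above has the shape of an AWS Rekognition `DetectLabels` response: each label carries a name and a confidence score (a percentage). A minimal sketch of how such tag lines could be produced, assuming the response has already been fetched; the sample response below is illustrative, not the actual API output for this image, and the S3 bucket/key names in the commented-out call are hypothetical:

```python
# Calling the API requires AWS credentials; for reference, the call looks like:
# import boto3
# client = boto3.client("rekognition")
# response = client.detect_labels(
#     Image={"S3Object": {"Bucket": "my-bucket", "Name": "photo.jpg"}},  # hypothetical
#     MinConfidence=55,
# )

def format_labels(response):
    """Render Rekognition labels as 'Name Confidence' lines, one decimal place."""
    lines = []
    for label in response["Labels"]:
        conf = round(label["Confidence"], 1)
        # Drop a trailing .0, matching entries like "Furniture 70" above
        conf_str = str(int(conf)) if conf == int(conf) else str(conf)
        lines.append(f"{label['Name']} {conf_str}")
    return lines

# Illustrative sample response, not the real data for this photograph
sample = {"Labels": [
    {"Name": "Monitor", "Confidence": 99.84},
    {"Name": "Furniture", "Confidence": 70.02},
]}
print(format_labels(sample))  # → ['Monitor 99.8', 'Furniture 70']
```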

Imagga
created on 2022-03-05

musical instrument 56
accordion 40
keyboard instrument 33.5
wind instrument 28.5
black 16.8
device 15.9
man 15.4
person 15.4
people 15
art 14.6
music 14.1
style 13.3
musician 12.9
dark 12.5
male 12
guitar 11.8
adult 11.7
rock 11.3
grunge 11.1
studio 10.6
stringed instrument 10.6
fashion 10.5
sexy 10.4
singer 10.1
night 9.8
interior 9.7
portrait 9.7
one 9.7
design 9.6
culture 9.4
lifestyle 9.4
model 9.3
modern 9.1
attractive 9.1
old 9
blackboard 8.9
musical 8.6
party 8.6
entertainment 8.3
decoration 8.2
dirty 8.1
symbol 8.1
chair 8
light 8
body 8
room 7.9
equipment 7.9
nightclub 7.8
play 7.7
sport 7.7
elegance 7.6
human 7.5
traditional 7.5
performer 7.4
instrument 7.3
lady 7.3
digital 7.3
sensual 7.3
religion 7.2
bass 7.2
posing 7.1
architecture 7

Microsoft
created on 2022-03-05

text 93.4
black and white 87.1
cartoon 81.3
person 77.9
clothing 71.7

Face analysis

Amazon

AWS Rekognition

Age 31-41
Gender Male, 92.8%
Calm 72.4%
Happy 16.8%
Disgusted 4%
Sad 2.9%
Angry 1.6%
Confused 1%
Surprised 0.9%
Fear 0.5%

AWS Rekognition

Age 43-51
Gender Male, 100%
Surprised 80.7%
Calm 17.6%
Happy 0.4%
Confused 0.3%
Disgusted 0.3%
Sad 0.2%
Angry 0.2%
Fear 0.1%
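The face entries above follow the shape of a Rekognition `DetectFaces` `FaceDetails` record: an `AgeRange` (low/high), a `Gender` value with confidence, and a list of `Emotions` sorted by confidence. A sketch of rendering one such record into the lines shown, using an illustrative sample rather than the real response for this photograph:

```python
def format_face(face):
    """Render one Rekognition FaceDetails entry as 'Age', 'Gender', and emotion lines."""
    lines = [f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}"]
    gender = face["Gender"]
    lines.append(f"Gender {gender['Value']}, {round(gender['Confidence'], 1)}%")
    # List emotions from most to least confident, as in the record above
    for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        lines.append(f"{emo['Type'].capitalize()} {round(emo['Confidence'], 1)}%")
    return lines

# Illustrative sample, not the actual API output for this image
sample = {
    "AgeRange": {"Low": 31, "High": 41},
    "Gender": {"Value": "Male", "Confidence": 92.83},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 16.8},
        {"Type": "CALM", "Confidence": 72.44},
    ],
}
print(format_face(sample))
# → ['Age 31-41', 'Gender Male, 92.8%', 'Calm 72.4%', 'Happy 16.8%']
```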

Feature analysis

Amazon

Person 98.7%
Shoe 98.2%
Helmet 95.8%

Captions

Microsoft

a person standing in front of a window 66.2%
an old photo of a person 66.1%
a person sitting in front of a window 56%

Text analysis

Amazon

ae