Human Generated Data

Title

Untitled (baby sitting at table)

Date

1949, made from later copy negative

People

Artist: Francis J. Sullivan, American, 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18297

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Furniture 99.9
Table 96.5
Human 94.6
Chair 93.3
Person 92.9
Indoors 91
Room 88.1
Baby 88
Desk 86.4
Newborn 80.8
Face 79.4
Portrait 68.4
Photo 68.4
Photography 68.4
Dining Table 67.6
Flooring 65.2
Musical Instrument 59.6
Leisure Activities 59.6
Piano 59.6
Wood 59.2
Cabinet 56.1
Bed 55.9
Sitting 55.5

Imagga
created on 2022-02-25

keyboard instrument 100
upright 100
stringed instrument 100
piano 100
percussion instrument 100
musical instrument 70.1
black 16.2
grand piano 15.6
home 13.6
furniture 13.4
interior 12.4
light 12
room 11.8
people 11.7
old 11.1
business 10.9
dark 10.8
lady 10.5
style 10.4
portrait 10.3
hair 10.3
chair 9.9
table 9.8
man 9.4
happy 9.4
desk 9.3
house 9.2
working 8.8
sitting 8.6
face 8.5
elegance 8.4
fun 8.2
one 8.2
office 8
sexy 8
looking 8
smiling 8
antique 7.8
attractive 7.7
person 7.7
child 7.5
window 7.3
computer 7.2
body 7.2
blond 7.2
adult 7.1
love 7.1
box 7.1
work 7.1

Google
created on 2022-02-25

Furniture 93.3
Table 88
Rectangle 85.8
Toddler 73.1
Monochrome photography 68.7
Baby 67.8
Monochrome 66.8
Vintage clothing 66.2
Sitting 66
Room 65.9
Comfort 63.5
Chair 62.7
Smile 62.1
Darkness 62
Wood 61.9
Art 60.8
Classic 60.3
Suit 59.4
Still life photography 56.6
Child 52.8

Microsoft
created on 2022-02-25

table 94.6
indoor 93.9
text 92.3
black and white 91.9
piano 70
black 65.8
seat 38.5
furniture 17.5

Face analysis

AWS Rekognition

Age 0-3
Gender Male, 94.9%
Surprised 48.6%
Calm 24.6%
Fear 20.8%
Confused 2.3%
Angry 1.2%
Sad 1.1%
Disgusted 0.9%
Happy 0.4%

Microsoft Cognitive Services

Age 0
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 92.9%
Piano 59.6%

Captions

Microsoft

a piano in a room 89.5%
a person sitting on a piano 71.3%
a cat sitting on top of a piano 46.9%

Text analysis

Amazon

TOT
TINY TOT
TINY
YT37A2-XAOOX