Human Generated Data

Title

Untitled (deceased baby in knit sweater and hat, Manchester, New Hampshire)

Date

1925, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.366

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Furniture 99.8
Human 97.6
Baby 97.6
Newborn 97.6
Bed 96.4
Cradle 79.8
Mammal 72.2
Cat 72.2
Animal 72.2
Pet 72.2
Person 70
Crib 67.2

Imagga
created on 2021-12-14

lace 30.5
furniture 21.5
sexy 20.1
pillow 19.1
model 18.7
body 18.4
fashion 18.1
person 17.2
adult 16.8
black 16.5
attractive 15.4
bed 15.2
one 14.2
people 13.9
room 12.8
baby bed 12.2
sofa 12.1
face 12.1
skin 11.8
erotic 11.7
currency 11.7
portrait 11.6
legs 11.3
pretty 11.2
home 11.2
cushion 11.2
clothing 11.1
blond 11
bedroom 10.8
cradle 10.6
lady 10.5
lying 10.3
hair 10.3
dollar 10.2
money 10.2
lifestyle 10.1
lingerie 10.1
sensual 10
bank 9.8
interior 9.7
paper 9.7
banknote 9.7
style 9.6
apartment 9.6
finance 9.3
close 9.1
studio 9.1
business 9.1
indoors 8.8
happy 8.8
closeup 8.8
container 8.7
underwear 8.7
couch 8.7
fashionable 8.5
elegance 8.4
house 8.4
banking 8.3
wealth 8.1
posing 8
smile 7.8
eyes 7.7
modern 7.7
child 7.6
cash 7.3
sensuality 7.3
gorgeous 7.2
stylish 7.2
baby 7.2
looking 7.2
financial 7.1
lovely 7.1
love 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.5

Face analysis

AWS Rekognition
Age 0-3
Gender Female, 66.4%
Calm 99.6%
Sad 0.2%
Confused 0.1%
Angry 0.1%
Happy 0%
Disgusted 0%
Surprised 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Bed 96.4%
Cat 72.2%
Person 70%

Captions

Microsoft

text 18.3%

Text analysis

Amazon

VOLV
VOLV CYLEJA ЫГИ
CYLEJA
ЫГИ