Human Generated Data

Title

Untitled (lace drape over coffin)

Date

c. 1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.370

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Furniture 100
Cradle 97.8
Person 95.2
Human 95.2

Clarifai
created on 2023-10-15

portrait 99.6
people 97.7
baby 97.5
one 97.3
art 95.9
window 94.6
adult 93.9
painting 93.8
monochrome 93.4
woman 93.2
family 92.5
retro 91.7
religion 90.8
old 90.7
vintage 90.2
girl 88.7
empty 88.5
child 88.3
wood 88.1
museum 87.8

Imagga
created on 2021-12-14

windowsill 40.1
hole 32.3
sill 32.1
cradle 25.7
structural member 25.1
support 22.2
furniture 21.6
baby bed 21.5
sculpture 17.7
furnishing 17.6
old 17.4
device 14.9
architecture 14.1
black 13.2
history 12.5
building 11.9
statue 11.4
face 11.3
art 10.6
ancient 10.4
monument 10.3
protective covering 10.2
love 10.2
stone 10.1
dress 9.9
portrait 9.7
eyes 9.5
culture 9.4
historic 9.2
window screen 8.7
man 8.7
light 8.7
famous 8.4
dark 8.3
city 8.3
tourism 8.2
wall 8.1
hair 7.9
window 7.8
people 7.8
male 7.8
antique 7.8
adult 7.7
person 7.7
historical 7.5
wedding 7.3
religion 7.2

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 98.4
human face 92.7
art 84.2
person 84.1
black and white 77.1
white 72.5
wedding dress 70.1
painting 70.1
drawing 68.4
sketch 67.3
old 62
woman 58.7
clothing 53.5
picture frame 12.1

Color Analysis

Face analysis

AWS Rekognition

Age 49-67
Gender Female, 97.1%
Calm 97.7%
Sad 0.8%
Happy 0.5%
Angry 0.3%
Fear 0.2%
Surprised 0.2%
Disgusted 0.1%
Confused 0.1%

Microsoft Cognitive Services

Age 38
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 95.2%

Categories

Captions

Microsoft
created on 2021-12-14

an old photo of a person 52.8%
an old photo of a television 26.6%
an old photo 26.5%