Human Generated Data

Title

Untitled (two children holding presents and sitting under Christmas tree with crèche)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3668

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Tree 99.6
Plant 99.6
Person 99.4
Human 99.4
Person 93.3
Ornament 81.3
Animal 80
Bird 80
Christmas Tree 76
Vegetation 64.1
Person 59.7
Room 56.9
Indoors 56.9
Person 44.5

Clarifai
created on 2019-06-01

people 99.9
adult 99
one 97.4
wear 97
two 96
group 94.3
woman 93.5
man 92
child 91.3
veil 85.5
sit 84.7
art 79.9
music 79.5
outfit 79.1
home 78.5
group together 78.3
portrait 75.9
administration 75.3
furniture 74.5
leader 72.8

Imagga
created on 2019-06-01

negative 30.9
grunge 30.7
old 27.9
vintage 25.6
film 24.4
dirty 21.7
antique 20.8
art 19.6
retro 18.8
texture 18.8
picket fence 17.4
wall 17.4
aged 16.3
photographic paper 15.9
grungy 15.2
musical instrument 14.9
ancient 14.7
fence 14.6
black 14.4
rough 12.8
pattern 12.3
structure 11.7
material 11.6
textured 11.4
building 11.3
decoration 11.1
brown 11
paint 10.9
border 10.9
man 10.8
photographic equipment 10.6
barrier 10.4
frame 10
damaged 9.5
weathered 9.5
paper 9.5
color 9.5
historical 9.4
architecture 9.4
space 9.3
male 9.2
design 9
detail 8.9
mask 8.8
graphic 8.8
text 8.7
person 8.7
rust 8.7
accordion 8.5
history 8.1
surface 7.9
face 7.8
people 7.8
cold 7.8
messy 7.7
aging 7.7
wind instrument 7.6
rusty 7.6
canvas 7.6
style 7.4
park 7.4
historic 7.3
danger 7.3
stone 7.3
dress 7.2

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

black and white 78.2
person 69.6
clothing 67.2
old 43.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Female, 51.7%
Happy 0.9%
Disgusted 1%
Angry 3.1%
Calm 77.2%
Sad 12.9%
Confused 2.1%
Surprised 2.9%

AWS Rekognition

Age 20-38
Gender Female, 89.8%
Sad 12.6%
Angry 7.1%
Happy 20.5%
Confused 10%
Calm 38.8%
Surprised 8.2%
Disgusted 2.6%

Feature analysis

Amazon

Person 99.4%
Bird 80%

Captions