Human Generated Data

Title

Untitled (women stand next to Christmas tree with presents)

Date

1936

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1493

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Plant 99.6
Tree 99.6
Human 99.2
Person 99.2
Person 98.1
Ornament 95.4
Christmas Tree 88.6
Person 63.7
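
The label/confidence pairs above are typical of Amazon Rekognition's DetectLabels output. A minimal sketch of how such tags could be produced with the boto3 client follows; the file name and region are hypothetical, not part of the original record.

    import boto3

    # Rekognition client; the region is an assumption
    client = boto3.client("rekognition", region_name="us-east-1")

    # Load the photograph from a local file (hypothetical path)
    with open("durette_studio_1936.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label/confidence pairs like those listed above
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=10,
        MinConfidence=50,
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")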

Clarifai
created on 2019-06-01

people 98.6
furniture 96.3
adult 95.8
indoors 95.1
woman 94.7
room 94.0
man 93.6
illustration 92.0
group 90.8
mirror 90.7
monochrome 90.7
window 87.7
design 87.1
chandelier 86.4
inside 83.8
table 82.8
chair 82.5
house 82.5
exhibition 82.2
architecture 81.9

Imagga
created on 2019-06-01

architecture 27.9
building 26.8
window 24.6
house 23.4
old 23
wall 18.8
negative 17.6
home 17.5
city 17.5
historic 16.5
film 15.2
ancient 14.7
stone 14.6
history 14.3
room 14.3
washbasin 14.2
vintage 14.1
tourism 14
door 13.7
art 13.4
interior 13.3
travel 12
street 12
frame 11.9
structure 11.7
windows 11.5
shop 11.5
decoration 11.4
basin 11.3
historical 11.3
town 11.1
antique 10.9
architectural 10.6
glass 10.4
culture 10.3
photographic paper 10.2
landmark 9.9
destination 9.3
marble 9.2
sculpture 9.2
dirty 9
texture 9
vessel 9
detail 8.8
holiday 8.6
grunge 8.5
balcony 8.4
monument 8.4
facade 8.2
retro 8.2
aged 8.1
brown 8.1
family 8
urban 7.9
blackboard 7.7
mercantile establishment 7.6
buildings 7.6
famous 7.4
design 7.3
people 7.2
black 7.2
religion 7.2

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

indoor 92.6
vase 90.2
window 87.1
christmas tree 81.5
house 79.7
furniture 69.8
table 68.6
white 62.6
black and white 55.4
picture frame 55.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 45-63
Gender Male, 50.4%
Sad 46.1%
Happy 46.7%
Confused 45.3%
Angry 45.6%
Surprised 45.5%
Disgusted 45.1%
Calm 50.7%

AWS Rekognition

Age 35-52
Gender Male, 52.9%
Disgusted 45.3%
Confused 45.3%
Angry 45.4%
Calm 50.3%
Surprised 45.5%
Happy 47.4%
Sad 45.8%

AWS Rekognition

Age 26-43
Gender Female, 53.4%
Angry 45.3%
Surprised 45.1%
Disgusted 45.2%
Sad 53.6%
Calm 45.3%
Happy 45.4%
Confused 45.1%
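
Each block above matches the per-face output of Amazon Rekognition's DetectFaces API when all facial attributes are requested: an estimated age range, a gender estimate with confidence, and a confidence score per emotion. A minimal sketch, reusing the hypothetical file from the earlier example:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("durette_studio_1936.jpg", "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] includes AgeRange, Gender, and Emotions per face
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")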

Feature analysis

Amazon

Person 99.2%

Categories