Human Generated Data

Title

Untitled (man holding boy next to Christmas tree)

Date

1936

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1491

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Plant 99.5
Tree 99.5
Person 99.2
Human 99.2
Ornament 88
Christmas Tree 80.9
Person 72.4
Room 71.6
Indoors 71.6
Person 69.8
Interior Design 59.4
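
The label/confidence pairs above are the kind of output produced by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags might be generated, assuming boto3 with configured AWS credentials and a local copy of the photograph (the file name and confidence threshold are illustrative):

```python
import boto3

IMAGE_PATH = "image.jpg"  # illustrative local copy of the photograph

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores (0-100),
# comparable to the "Plant 99.5", "Tree 99.5", ... pairs listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```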

Clarifai
created on 2019-06-01

people 98.8
furniture 98.7
indoors 98.1
room 97.4
adult 95.2
mirror 95.1
window 94.3
woman 91.3
no person 90.2
seat 89.7
man 89.6
chair 89.4
home 89.2
monochrome 89
luxury 88.5
house 87.1
group 86.2
exhibition 85.1
chandelier 84.7
decoration 84.6
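
The Clarifai concepts above could come from the v2 predict endpoint with a general recognition model. A hedged sketch, assuming a valid API key; the model identifier and image URL below are illustrative, not taken from this record:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"            # assumption: a valid Clarifai key
MODEL_ID = "general-image-recognition"       # illustrative model identifier
IMAGE_URL = "https://example.org/photo.jpg"  # illustrative image URL

# The v2 predict call returns concepts with values in [0, 1];
# multiplying by 100 gives scores comparable to "people 98.8" above.
response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```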

Imagga
created on 2019-06-01

door 40.8
house 32.6
sliding door 30.9
home 28.7
interior 27.4
architecture 25.3
window 25.1
room 23.6
glass 18.9
wall 18.8
movable barrier 18.5
light 17.4
curtain 16.4
building 16
furniture 15.2
modern 14.7
old 14.6
shower curtain 14.5
design 13.5
barrier 13.5
decoration 13.2
indoors 13.2
bathroom 13.2
sketch 12.5
frame 11.9
furnishing 11.5
windows 10.6
blind 10.5
estate 10.4
luxury 10.3
decor 9.7
inside 9.2
lamp 8.9
protective covering 8.8
windowsill 8.8
residential 8.6
contemporary 8.5
travel 8.4
elegance 8.4
clean 8.3
vintage 8.3
shower 8.2
art 8.1
stone 8.1
water 8
representation 8
flowers 7.8
residence 7.8
ancient 7.8
flower 7.7
real 7.6
antique 7.6
city 7.5
floor 7.4
town 7.4
balcony 7.3
sill 7.2
drawing 7.1
table 7.1
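
The Imagga tags above resemble output from the /v2/tags endpoint. A hedged sketch, assuming Imagga API credentials; the image URL is illustrative:

```python
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"              # assumption: Imagga credentials
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/photo.jpg"  # illustrative image URL

# The /v2/tags endpoint returns tags with confidence scores (0-100),
# comparable to "door 40.8", "house 32.6", ... above.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```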

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 48-68
Gender Male, 54%
Angry 45.6%
Happy 48%
Calm 48.6%
Surprised 46%
Sad 45.8%
Confused 45.3%
Disgusted 45.7%

AWS Rekognition

Age 30-47
Gender Male, 50.1%
Surprised 49.9%
Sad 49.8%
Angry 49.6%
Disgusted 49.5%
Calm 49.6%
Happy 49.5%
Confused 49.6%
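
The two face records above match the structure returned by Amazon Rekognition's DetectFaces operation with all attributes requested. A minimal sketch, assuming boto3 with configured AWS credentials and an illustrative local image file:

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("image.jpg", "rb") as f:         # illustrative local copy of the photograph
    image_bytes = f.read()

# DetectFaces with Attributes=["ALL"] returns an age range, gender, and
# per-emotion confidences for each detected face, like the records above.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```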

Feature analysis

Amazon

Person 99.2%

Categories