Human Generated Data

Title

The Invalid

Date

c. 1868-1870

People

Artist: Edward Lamson Henry, American, 1841-1919

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Daniel A. Pollack, Class of 1960, American Art Acquisition Fund, 2005.189

Machine Generated Data

Tags

Amazon
created on 2019-04-05

Painting 99.7
Art 99.7
Person 87.6
Human 87.6
Wood 70.6
Bed 60.3
Furniture 60.3

Clarifai
created on 2018-04-19

furniture 99.7
room 99.7
people 99.3
bed 98.8
indoors 98.8
bedroom 98.7
adult 98.4
seat 98.1
painting 96.8
one 96.7
home 95.6
art 93.6
house 91.7
woman 90.9
interior design 90.6
two 90.4
wear 89.7
reclining 89.5
table 87.8
family 87.7

Imagga
created on 2018-04-19

mosquito net 80.8
protective covering 49.4
room 37.8
bed 36.8
covering 34.8
furniture 34.7
interior 30.1
bedroom 26.8
luxury 24
home 22.3
covered couch 20.3
four-poster 19.6
hotel 19.1
house 17.6
dark 16.7
litter 16.6
fashion 15.8
sofa 15.4
lamp 14.8
cadaver 14.3
style 14.1
design 14.1
modern 14
pillow 13.9
light 13.4
old 13.2
wall 13.2
person 12.9
conveyance 12.5
comfortable 12.4
decor 12.4
table 12.2
travel 12
inside 12
elegance 11.8
bedroom furniture 11.7
wood 11.7
adult 11.6
indoors 11.4
model 10.9
decoration 10.9
pillows 10.8
cozy 10.8
sleep 10.7
night 10.7
rest 10.5
sexy 10.4
body 10.4
seat 10.3
architecture 10.2
motel 9.9
luxurious 9.7
ancient 9.5
people 9.5
passion 9.4
relaxation 9.2
vintage 9.1
attractive 9.1
sensuality 9.1
tourist 9.1
couch 8.7
skin 8.5
stone 8.4
religious 8.4
portrait 8.4
floor 8.4
gold 8.2
one 8.2
domestic 8.1
lady 8.1
water 8
hair 7.9
mattress 7.9
love 7.9
suite 7.9
accommodation 7.9
linen 7.9
king 7.8
tour 7.7
residential 7.7
relax 7.6
erotic 7.6
man 7.4
chair 7.4
furnishing 7.4
sensual 7.3
dress 7.2
black 7.2
religion 7.2
smile 7.1

Google
created on 2018-04-19

Microsoft
created on 2018-04-19

floor 94.4
indoor 92.5

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 88.2%
Confused 0.9%
Disgusted 2%
Surprised 1.5%
Calm 69.2%
Happy 16.3%
Sad 7.7%
Angry 2.4%

AWS Rekognition

Age 38-59
Gender Male, 51.1%
Calm 47.7%
Disgusted 45.4%
Surprised 45.3%
Happy 45.2%
Sad 49%
Angry 47%
Confused 45.4%

Microsoft Cognitive Services

Age 39
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 99.7%
Person 87.6%
Bed 60.3%

Captions

Microsoft

a person standing next to a fireplace 75.2%
a person standing next to a fireplace 68.4%
a person standing in front of a fireplace 68.3%