Human Generated Data

Title

Dressing Station

Date

1916

People

Artist: Christopher R. W. Nevinson, British, 1889 - 1946

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of James N. Rosenberg, M4543

Machine Generated Data

Tags

Amazon
created on 2019-08-10

Painting 98.6
Art 98.6
Human 98.6
Person 98.6
Person 98.4
Person 95.6
Person 93
Person 91.4
Person 78.3
Poster 65.2
Advertisement 65.2
Person 43.6
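The repeated "Person" entries above, each with its own confidence, are how Amazon Rekognition reports multiple detected instances of the same label. A minimal sketch of flattening a DetectLabels-style response into the list shown; the `sample` dict below is illustrative data shaped like the API's response, not the actual output for this print:

```python
# Flatten a Rekognition-style DetectLabels response into (name, confidence)
# pairs, one row per detected instance, sorted by confidence descending.
# The sample response is illustrative, not the real result for this image.

def flatten_labels(response):
    rows = []
    for label in response["Labels"]:
        # A label with no Instances still contributes one row
        # using its overall confidence.
        instances = label.get("Instances") or [{}]
        for inst in instances:
            conf = inst.get("Confidence", label["Confidence"])
            rows.append((label["Name"], round(conf, 1)))
    return sorted(rows, key=lambda r: -r[1])

sample = {
    "Labels": [
        {"Name": "Painting", "Confidence": 98.6, "Instances": []},
        {"Name": "Person", "Confidence": 98.6, "Instances": [
            {"Confidence": 98.6}, {"Confidence": 95.6},
        ]},
    ]
}

for name, conf in flatten_labels(sample):
    print(name, conf)
```

Each "Person N.n" line in the tag list corresponds to one such instance row.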

Clarifai
created on 2019-08-10

people 99.9
print 99.9
illustration 99.6
art 99.3
man 98.8
group 98.7
adult 98.5
furniture 96.7
portrait 96.3
picture frame 95.4
one 95.2
painting 95.2
engraving 94.9
two 94.3
child 93.7
woman 93.3
wear 91.8
woodcut 90.8
antique 90.2
book bindings 88.7

Imagga
created on 2019-08-10

sketch 45.8
drawing 38.7
representation 28.1
art 21.9
architecture 18.1
bedroom 17.5
room 17.4
design 17
old 16.7
sculpture 16
panel 15.7
decoration 14.7
religion 14.3
church 13.9
currency 13.4
antique 13.1
detail 12.9
money 12.8
pillow 12.5
statue 12.4
window 12.2
culture 12
banking 11.9
bank 11.6
history 11.6
furniture 11.6
dollar 11.1
finance 11
house 10.9
symbol 10.8
vintage 10.7
building 10.6
style 10.4
home 10.4
business 10.3
religious 10.3
famous 10.2
paper 10.2
cash 10.1
decorative 10
retro 9.8
interior 9.7
quilt 9.6
black 9.6
god 9.6
exchange 9.5
bill 9.5
historic 9.2
wealth 9
sofa 8.9
decor 8.8
artistic 8.7
holy 8.7
ancient 8.6
spiritual 8.6
travel 8.4
savings 8.4
investment 8.2
pattern 8.2
landmark 8.1
frame 8
product 7.8
bed 7.8
bedclothes 7.7
architectural 7.7
grunge 7.7
creation 7.6
temple 7.6
city 7.5
monument 7.5
structure 7.4
artwork 7.3
paint 7.2
covering 7.2
colorful 7.2
studio couch 7.1
financial 7.1
face 7.1
market 7.1
wall 7.1
carving 7
modern 7

Google
created on 2019-08-10

(no tags returned)

Microsoft
created on 2019-08-10

clothing 98
person 97.9
text 97.9
drawing 83.5
man 80.9
sketch 61.7
old 58.4
cartoon 53.5
posing 35.1
painting 31.9

Face analysis

Amazon

AWS Rekognition

Age 1-7
Gender Female, 65.3%
Calm 15.2%
Confused 1.9%
Surprised 13.7%
Sad 0.7%
Happy 8.1%
Fear 3.2%
Disgusted 7.3%
Angry 49.8%

AWS Rekognition

Age 20-32
Gender Male, 54.6%
Angry 49.8%
Fear 45%
Confused 45%
Calm 49.8%
Happy 45%
Disgusted 45%
Surprised 45.1%
Sad 45.2%

AWS Rekognition

Age 26-40
Gender Female, 51%
Fear 45.1%
Surprised 45.2%
Calm 48.9%
Disgusted 45.6%
Sad 47.1%
Angry 47.4%
Happy 45.6%
Confused 45.1%

AWS Rekognition

Age 14-26
Gender Female, 52.8%
Confused 45.1%
Angry 48.9%
Surprised 45.1%
Fear 45.9%
Sad 47.5%
Calm 47.4%
Disgusted 45%
Happy 45%

Feature analysis

Amazon

Person 98.6%
Poster 65.2%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

1916
in
uom
R.w.Nwi
Nww
R.w.Nwi w Nww in
uom lihn yaL
yaL
lihn
w

Google

Rw.Nwn 916
Rw.Nwn
916