Human Generated Data

Title

WOMAN STANDING BY VERANDAH, SMOKING PIPE

Date

Edo period, circa 1765-1770

People

Artist: Suzuki Harunobu 鈴木春信, Japanese 1725-1770

Classification

Prints

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of the Friends of Arthur B. Duel, 1933.4.2629

Machine Generated Data

Tags

Amazon
created on 2019-07-05

Human 98.2
Person 98.2
Text 77.9
Label 77.9
Art 77.5
Advertisement 76.5
Poster 75.3
Clothing 72.7
Apparel 72.7
People 61.5
Painting 57.2
Paper 55.8
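
The Amazon labels above are the kind of output returned by AWS Rekognition's label-detection API. A minimal sketch of how such tag/score pairs can be produced with boto3, assuming the print has been downloaded to a local file (the filename and region are placeholders):

```python
import boto3

# Placeholder region and filename; substitute your own.
client = boto3.client("rekognition", region_name="us-east-1")

with open("harunobu_print.jpg", "rb") as f:
    image_bytes = f.read()

# Ask Rekognition for labels above a confidence threshold,
# analogous to the tag/score pairs listed above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55.0,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```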

Clarifai
created on 2019-07-05

illustration 98.9
retro 98.7
art 97.2
print 97.2
painting 97.1
vintage 97.1
people 96.5
paper 95.9
wear 95.5
man 93.4
lithograph 91.4
antique 91.2
old 91.1
no person 90.6
adult 90.6
ancient 90.5
manuscript 89.9
one 87.4
bill 87.1
artistic 87

Imagga
created on 2019-07-05

comic book 40.3
vintage 31.5
book jacket 29
old 28.6
paper 26
retro 24.6
ancient 23.3
jacket 22.6
stamp 21.4
card 21.3
money 20.4
cash 20.1
currency 18.8
grunge 18.7
bookmark 18
bank 17.9
decoration 17.6
antique 17.3
wrapping 17.2
letter 16.5
aged 16.3
print media 16.1
financial 16
texture 16
postmark 15.8
mail 15.3
finance 15.2
note 14.7
philately 13.8
envelope 13.6
notes 13.4
banking 12.9
postage 12.8
frame 12.5
covering 12.3
design 12
graffito 11.8
symbol 11.4
pattern 10.9
business 10.9
postal 10.8
pay 10.5
economy 10.2
wealth 9.9
circa 9.9
art 9.8
savings 9.3
wallpaper 9.2
dirty 9
material 8.9
banknotes 8.8
payment 8.7
damaged 8.6
blank 8.6
bill 8.6
dollar 8.4
page 8.3
message 8.2
brown 8.1
banknote 7.8
exchange 7.6
traditional 7.5
document 7.4
style 7.4
drawing 7.2
market 7.1

Google
created on 2019-07-05

Microsoft
created on 2019-07-05

cartoon 99
drawing 98.6
child art 94.4
illustration 89.1
sketch 87.8
painting 81
person 55.8
book 53.1
clothing 50
fabric 10.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Male, 66.4%
Angry 13.7%
Sad 40.8%
Calm 13.9%
Disgusted 7.8%
Happy 11.5%
Surprised 5%
Confused 7.2%
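
The age, gender, and emotion estimates above match the shape of AWS Rekognition's face-detection response. A minimal sketch, under the same placeholder assumptions as the label-detection sketch:

```python
import boto3

# Placeholder region and filename, as in the label-detection sketch above.
client = boto3.client("rekognition", region_name="us-east-1")
with open("harunobu_print.jpg", "rb") as f:
    image_bytes = f.read()

# Request all facial attributes (age range, gender, emotions, etc.).
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```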

Feature analysis

Amazon

Person 98.2%

Categories

Imagga

paintings art 83.6%
interior objects 15.6%

Captions

Microsoft
created on 2019-07-05

a close up of a logo 61.7%
a close up of a piece of paper 49.7%
a piece of paper 49.6%

Text analysis

Amazon

1
1 Fs gur
Fs gur
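
The detected strings above look like an attempt to read the Japanese signature and seal as Latin text; they resemble the LINE entries of AWS Rekognition's text-detection response. A minimal sketch of that call, under the same placeholder assumptions as the earlier sketches:

```python
import boto3

# Placeholder region and filename, as in the earlier sketches.
client = boto3.client("rekognition", region_name="us-east-1")
with open("harunobu_print.jpg", "rb") as f:
    image_bytes = f.read()

# Detect text in the image; LINE entries correspond to the strings listed above.
response = client.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))
```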

Google

鈴本春信画 (machine reading of the brush signature 鈴木春信画, "picture by Suzuki Harunobu")