Human Generated Data

Title

Mitate of Act Seven from the series Treasury of Loyal Retainers (Chūshingura: Shichi danme)

Date

Late Edo period, dated 1797

People

Artist: Utagawa Toyokuni 歌川豊国, Japanese 1769 - 1825

Classification

Prints

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Dr. Denman W. Ross, 1917.73

Machine Generated Data

Tags

Amazon
created on 2019-07-06

Human 88.5%
Person 88.5%
Person 74.4%
Art 72%
Person 61.9%
Text 59.1%
Stocking 58.3%
Christmas Stocking 58.3%
Gift 58.3%
Sewing 57.5%
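
These label scores are the kind of output Amazon Rekognition's label detection returns. The sketch below shows how comparable tags could be reproduced with boto3; it is a minimal example under stated assumptions, not the pipeline used for this record, and the file name print.jpg is a hypothetical stand-in for the digitized image.

# Minimal sketch: label detection with Amazon Rekognition via boto3.
# Assumes AWS credentials are configured; "print.jpg" is a hypothetical local copy of the image.
import boto3

rekognition = boto3.client("rekognition")

with open("print.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=55,  # roughly the lowest confidence shown in the list above
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")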

Clarifai
created on 2019-07-06

people 97.6%
art 97.3%
text 96.1%
print 95.4%
illustration 94.9%
one 94.2%
wear 94%
design 92.1%
man 91.5%
woman 91.4%
adult 91.1%
vertical 89.4%
outdoors 89.4%
old 89.1%
ornate 88.9%
portrait 88.9%
painting 88.7%
fashion 87.9%
no person 87.9%
symbol 86.2%
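
The Clarifai concepts above come from a general-purpose image recognition model. A rough sketch of querying Clarifai's v2 predict endpoint with requests follows; the endpoint path, model identifier, and response shape are assumptions based on the public v2 REST API and should be checked against Clarifai's current documentation. The credential and image URL are placeholders.

# Rough sketch: concept prediction with Clarifai's v2 REST API.
# The model identifier, endpoint path, and response shape are assumptions; verify against Clarifai docs.
import os
import requests

CLARIFAI_API_KEY = os.environ["CLARIFAI_API_KEY"]  # placeholder credential
IMAGE_URL = "https://example.org/print.jpg"        # hypothetical image URL
MODEL_ID = "general-image-recognition"             # assumed public general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai reports values in 0-1; the list above shows them as percentages.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")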

Imagga
created on 2019-07-06

footwear 40.3%
sock 39.9%
hosiery 34%
clothing 24.3%
money 20.4%
currency 18.8%
boot 18.7%
covering 18%
art 17.2%
cash 16.5%
cowboy boot 16.2%
finance 16%
financial 15.1%
paper 15%
old 13.9%
dollar 13.9%
banking 13.8%
symbol 13.5%
wealth 13.5%
black 13.3%
vintage 13.2%
retro 13.1%
sculpture 12.3%
business 12.1%
savings 12.1%
bank 12%
shoe 11.8%
mosaic 11.1%
grunge 11.1%
note 11%
tile 10.8%
bust 10.7%
bill 10.5%
investment 10.1%
stocking 9.9%
pattern 9.6%
antique 9.5%
ancient 9.5%
stamp 9.1%
design 9%
object 8.8%
exchange 8.6%
card 8.5%
guitar 8.3%
fashion 8.3%
music 8.1%
success 8%
close 8%
boots 7.8%
bills 7.8%
banknote 7.8%
consumer goods 7.6%
man 7.5%
instrument 7.5%
lace 7.5%
one 7.5%
style 7.4%
letter 7.3%
device 7.2%
detail 7.2%
decoration 7.2%
body 7.2%
male 7.1%
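
The Imagga tags above come from its auto-tagging endpoint. A rough sketch with requests follows, assuming HTTP basic authentication with an API key/secret pair as described in Imagga's v2 API documentation; the credentials and image URL are placeholders, and the response shape should be verified against the current docs.

# Rough sketch: auto-tagging with the Imagga v2 REST API.
# Endpoint and response shape are assumptions; verify against Imagga documentation.
import os
import requests

IMAGGA_KEY = os.environ["IMAGGA_API_KEY"]        # placeholder credential
IMAGGA_SECRET = os.environ["IMAGGA_API_SECRET"]  # placeholder credential
IMAGE_URL = "https://example.org/print.jpg"      # hypothetical image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")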

Google
created on 2019-07-06

Microsoft
created on 2019-07-06

drawing 97.1%
cartoon 96.6%
painting 90%
sketch 88.3%
poster 83.8%
art 77.6%
book 59.1%

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 54.7%
Angry 45.6%
Sad 46.3%
Confused 45.6%
Surprised 45.4%
Calm 50.9%
Happy 45.5%
Disgusted 45.6%

AWS Rekognition

Age 26-43
Gender Male, 54.9%
Sad 45.3%
Confused 46.1%
Happy 47.4%
Surprised 46.2%
Calm 47.1%
Disgusted 47.1%
Angry 45.8%

AWS Rekognition

Age 20-38
Gender Male, 53.6%
Surprised 45.5%
Confused 46.3%
Angry 45.6%
Disgusted 46%
Happy 45.3%
Sad 45.4%
Calm 50.8%
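
The three face records above (an age range, a gender estimate, and an emotion distribution per detected face) match the shape of Amazon Rekognition's face detection output. A minimal sketch with boto3 follows, again using a hypothetical local file name for the image.

# Minimal sketch: face detection with Amazon Rekognition via boto3, requesting all
# facial attributes so that age range, gender, and emotions are returned.
import boto3

rekognition = boto3.client("rekognition")

with open("print.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")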

Feature analysis

Amazon

Person 88.5%

Categories

Imagga

pets animals 91.4%
paintings art 4.7%
food drinks 1.1%

Captions

Microsoft
created on 2019-07-06

a close up of a piece of paper 71.3%
a close up of a logo 71.2%
a piece of paper 70.8%
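
The Microsoft tags and captions above are characteristic of Azure's Computer Vision image description service, which returns both tags and ranked caption candidates with confidences. The sketch below uses the REST API directly; the API version, endpoint path, and response shape are assumptions to verify against current Azure documentation, and the endpoint, key, and image URL are placeholders.

# Rough sketch: image description with the Azure Computer Vision REST API.
# API version and response shape are assumptions; verify against Azure documentation.
import os
import requests

AZURE_ENDPOINT = os.environ["AZURE_CV_ENDPOINT"]  # e.g. https://<resource>.cognitiveservices.azure.com
AZURE_KEY = os.environ["AZURE_CV_KEY"]            # placeholder credential
IMAGE_URL = "https://example.org/print.jpg"       # hypothetical image URL

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()
description = resp.json()["description"]

for tag in description["tags"]:
    print(tag)
for caption in description["captions"]:
    # Confidence is reported in 0-1; the captions above show it as a percentage.
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")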