Human Generated Data

Title

Minamoto no Yorimasa and Tamamo no Mae

Date

Edo period,

People

Artist: Yashima Gakutei 八島岳亭, Japanese, 1786? - 1868

Classification

Prints

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of the Friends of Arthur B. Duel, 1933.4.1724

Machine Generated Data

Tags

Amazon
created on 2019-07-06

Art 90.5
Person 85.3
Human 85.3
Painting 59.6
Tile 57.8
Ornament 57
Tapestry 57
Mosaic 55.9
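
The Amazon tags above are the kind of label/confidence pairs returned by Amazon Rekognition's label-detection API. A minimal sketch in Python with boto3; the image path and the confidence cutoff are placeholder assumptions, not values recorded with this object:

    import boto3

    # Credentials and region are assumed to be configured via the AWS CLI.
    rekognition = boto3.client("rekognition")

    # "print.jpg" is a placeholder path for the digitized print.
    with open("print.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # assumed cutoff; the listed tags bottom out near 55
        )

    # Print each label name with its confidence, matching the "Label 90.5" layout above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')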

Clarifai
created on 2019-07-06

art 99.7
painting 99.4
illustration 99.3
religion 96.9
ancient 96.8
old 96.5
culture 93.1
antique 92.9
god 92.2
decoration 91.1
print 89.9
people 89.6
museum 89.3
vintage 88.1
religious belief 87.6
Gothic 87.3
wall 86.1
symbol 86
crown 85.2
mural 84.4

Imagga
created on 2019-07-06

mosaic 80.1
transducer 52.3
electrical device 39.2
art 30.2
bib 26.7
device 25.3
old 24.4
vintage 24
retro 19.7
paper 19.6
pattern 18.5
decoration 16.8
texture 15.3
design 15.3
gold 14
antique 13.9
grunge 13.6
instrumentality 13.1
ancient 13
card 11.9
finance 11.8
cotton 11.7
colorful 11.5
culture 11.1
money 11.1
banking 11
note 11
cash 11
decorative 10.9
traditional 10.8
wallpaper 10.7
color 10.6
golden 10.3
symbol 10.1
letter 10.1
holiday 10
currency 9.9
travel 9.9
stamp 9.7
artistic 9.6
temple 9.5
wall 9.4
religion 9
bank 9
map 8.9
apron 8.9
postmark 8.9
decor 8.8
notes 8.6
ornament 8.6
mail 8.6
yellow 8.6
tourism 8.2
jigsaw puzzle 8.2
paint 8.1
painting 8.1
brown 8.1
object 8.1
history 8
fabric 8
icon 7.9
architecture 7.8
banknote 7.8
creation 7.7
post 7.6
backdrop 7.4
detail 7.2
protective garment 7.2
bookmark 7.1
world 7.1

Google
created on 2019-07-06

Microsoft
created on 2019-07-06

cartoon 97.6
text 97
painting 96.8
book 94.8
drawing 94
indoor 85
person 78.2
clothing 71.6

Color Analysis

Face analysis

AWS Rekognition

Age 20-38
Gender Female, 81.1%
Angry 4.1%
Confused 2.3%
Disgusted 3.2%
Happy 4.4%
Sad 20.3%
Surprised 3.9%
Calm 61.7%

AWS Rekognition

Age 35-55
Gender Female, 99.1%
Sad 7.9%
Angry 2.8%
Disgusted 0.9%
Surprised 73%
Calm 7.3%
Happy 1.4%
Confused 6.6%
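
Both face records above follow the shape of Rekognition's face-detection response when all attributes are requested. A hedged sketch; the image path is a placeholder, and the printed fields mirror the age range, gender, and emotion estimates listed above:

    import boto3

    rekognition = boto3.client("rekognition")

    # "print.jpg" is a placeholder path for the digitized print.
    with open("print.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]      # e.g. {"Low": 20, "High": 38}
        gender = face["Gender"]     # e.g. {"Value": "Female", "Confidence": 81.1}
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:  # e.g. CALM, SAD, SURPRISED with confidences
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')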

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely
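
The Google Vision row reports likelihood buckets rather than percentages. A minimal sketch, assuming the google-cloud-vision 2.x client, a placeholder image path, and credentials supplied via GOOGLE_APPLICATION_CREDENTIALS:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # "print.jpg" is a placeholder path for the digitized print.
    with open("print.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each field is a Likelihood enum: VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)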

Feature analysis

Amazon

Person 85.3%
Painting 59.6%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2019-07-06

a book on a blanket 28.6%
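
The caption and its 28.6% score resemble the output of Azure Computer Vision's describe operation. A sketch, assuming the azure-cognitiveservices-vision-computervision SDK and placeholder endpoint, key, and image path:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key; substitute real Azure credentials.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<your-key>"),
    )

    # "print.jpg" is a placeholder path for the digitized print.
    with open("print.jpg", "rb") as image_stream:
        description = client.describe_image_in_stream(image_stream, max_candidates=3)

    # Confidence is returned on a 0-1 scale; scale to percent to match the listing above.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")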

Text analysis

Amazon

4aL
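
The fragment above is the kind of result returned by Rekognition's text-detection endpoint. A minimal sketch with boto3 and a placeholder image path:

    import boto3

    rekognition = boto3.client("rekognition")

    # "print.jpg" is a placeholder path for the digitized print.
    with open("print.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Each detection is a LINE or WORD with the string Rekognition read and a confidence.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              f'{detection["Confidence"]:.1f}%')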