Human Generated Data

Title

Courtesan and Dog

Date

Late Edo to early Meiji period, mid- to late 19th century

People

Artist: Utagawa Kuniteru, Japanese, 1808-1876

Classification

Paintings

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Mrs. Jerome I. H. Downes, 1967.61

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Person 96.2
Human 96.2
Art 91.1
Drawing 83.5
Leisure Activities 74.8
Text 70.9
Performer 67.3
Sketch 63.3
Photography 60.4
Photo 60.4
Dance Pose 57.7
Poster 57.6
Advertisement 57.6
Painting 56.6
Dance 55.5

Clarifai
created on 2020-04-24

people 99.3
one 98.5
art 94.8
portrait 92.3
music 91.3
adult 89.7
man 88.8
retro 87.8
woman 86.9
wear 85.1
child 79.3
painting 76.5
theater 74.4
print 72.2
vintage 71
illustration 69.9
administration 69.5
old 68.5
two 67.5
museum 66.4

Imagga
created on 2020-04-24

bookmark 46.2
book jacket 31.5
jacket 26.4
vintage 22.3
art 19.9
wrapping 18.6
black 16.3
covering 15.8
culture 15.4
old 14.6
grunge 13.6
stamp 13.5
symbol 13.5
envelope 13.4
mail 13.4
antique 13.2
portrait 12.9
postmark 12.8
postage 12.8
retro 12.3
letter 11.9
postal 10.8
silhouette 10.8
painted 10.5
post 10.5
binding 10.4
ancient 10.4
man 10.1
global 10
aged 9.9
male 9.9
fame 9.9
renaissance 9.8
paintings 9.8
painter 9.4
paper 9.4
model 9.3
face 9.2
fashion 9
dress 9
one 9
masterpiece 8.9
design 8.9
known 8.9
printed 8.8
closeup 8.8
icon 8.7
fine 8.6
unique 8.5
drawing 8.3
blackboard 8.2
statue 8.2
style 8.2
paint 8.1
door 8.1
history 8
sexy 8
post mail 7.9
zigzag 7.9
delivery 7.8
museum 7.8
cutting 7.7
wall 7.7
communications 7.7
texture 7.6
temple 7.6
historical 7.5
sport 7.4
artwork 7.3
detail 7.2
office 7.2

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

text 98.3
drawing 98.2
sketch 97.1
cartoon 75.9
clothing 74.5
person 73.9
white 62.2
old 44.1

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 24-38
Gender Male, 54.2%
Sad 45.6%
Surprised 50.7%
Calm 47.7%
Angry 45.1%
Fear 45.5%
Happy 45.2%
Disgusted 45.1%
Confused 45.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.2%
Poster 57.6%

Categories

Text analysis

Amazon

296/
19