Human Generated Data

Title

Woman and Dog under New Year's Decorations

Date

Edo period, circa 1782

People

Artist: Torii Kiyonaga, Japanese, 1752–1815

Classification

Prints

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of Dr. Denman W. Ross, 1916.583

Machine Generated Data

Tags

Amazon
created on 2019-07-06

Art 92.4
Painting 80.9
Human 62.7
Drawing 60.1
Person 42.3

Clarifai
created on 2019-07-06

illustration 99.4
chalk out 98.4
print 97.9
art 97.6
people 96.9
man 94.5
no person 94.1
wear 93.5
retro 92.9
adult 92
lithograph 91.4
one 91.1
engraving 91.1
ancient 90.4
paper 89.2
woman 88.6
woodcut 87.5
antique 86.7
painting 86.6
vintage 86.5

Imagga
created on 2019-07-06

representation 100
sketch 100
drawing 100
art 28.5
retro 23
vintage 22.4
design 20.9
old 18.8
style 16.3
graphic 16.1
antique 16
decorative 15.9
artistic 14.8
ancient 14.7
line 14.6
black 14.4
symbol 14.2
silhouette 13.3
pattern 13
grunge 12.8
map 12.4
artwork 11.9
outline 11.4
decoration 11.3
ornament 11.2
texture 11.1
floral 11.1
paper 11
aged 10.9
cartoon 10.7
people 10.6
stamp 10
flower 10
paint 10
wallpaper 10
painting 9.9
sepia 9.7
shape 9.6
card 9.5
man 9.4
clip art 9.3
frame 9.3
backdrop 9.1
gold 9.1
creative 8.8
contour 8.7
elegant 8.6
draw 8.6
atlas 8.4
element 8.3
postmark 7.9
animal 7.9
continent 7.8
travel 7.8
modern 7.7
ink 7.7
geography 7.7
stylized 7.7
navigation 7.7
mail 7.7
capital 7.6
plan 7.6
human 7.5
elements 7.4
letter 7.3
icon 7.1
world 7.1
hand 7.1
curve 7

Google
created on 2019-07-06

Line art 93.4
Drawing 83.6
Sketch 72.9
Art 72.1
Illustration 70.1
Fictional character 52.8
Artwork 52.3

Microsoft
created on 2019-07-06

drawing 99.3
sketch 99
cartoon 95.1
illustration 91.1
child art 83.6
art 75.8
ink 66
fabric 14.6

Face analysis

Amazon

AWS Rekognition

Age 19-36
Gender Male, 54.8%
Sad 45.2%
Confused 46%
Happy 46.2%
Calm 50.9%
Surprised 45.5%
Angry 45.4%
Disgusted 45.8%

Feature analysis

Amazon

Person 42.3%

Captions

Microsoft

a fabric surface 40.5%
a close up of a fabric surface 40.4%
a close up of fabric 37.1%