Human Generated Data

Title

Men and women in front of a building, one sheet of a multi-sheet print

Date

-

People

Artist: Kitagawa Tsukimaro, Japanese ? - 1830

Artist: Utagawa Toyokuni 歌川豊国, Japanese 1769 - 1825

Classification

Prints

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Mrs. Flora Virginia Livingston, 1950.66.5

Machine Generated Data

Tags

Amazon
created on 2019-07-23

Art 96.7
Painting 96.2
Human 84.5
Person 84.5
Person 84.5
Person 81.8
Person 79.1
Person 69.7
Drawing 66.7
Tapestry 57.4
Ornament 57.4
Person 43.2

Clarifai
created on 2019-07-23

art 99.1
illustration 98.7
painting 98.6
retro 97.2
people 97.1
old 96.9
ancient 96.3
print 96.2
paper 96.1
vintage 95.6
man 93.7
antique 92.3
manuscript 91.7
adult 88.5
artistic 87.5
wall 87.4
religion 87.2
chalk out 85.5
culture 83
Gothic 82.6

Imagga
created on 2019-07-23

drawing 58.4
sketch 58.2
representation 56.5
map 51.2
vintage 29
art 25.1
atlas 25
antique 24.6
design 24.3
retro 21.3
old 20.9
pattern 19.9
graphic 19.7
wallpaper 18.4
pirate 17.5
geography 17.4
capital 17.1
grunge 17.1
decorative 16.7
travel 16.2
world 16
paper 15.7
cartoon 15.2
element 14.9
money 14.5
currency 14.4
decoration 14.4
set 13.6
finance 13.5
navigation 13.5
symbol 13.5
frame 13.3
globe 13
business 12.8
discovery 12.7
continent 12.6
tour 12.6
gold 12.4
plan 12.3
icon 11.9
country 11.8
location 11.8
route 11.7
collection 11.7
sepia 11.7
silhouette 11.6
financial 11.6
comic book 11.6
flower 11.6
direction 11.4
journey 11.3
shape 11.2
dollar 11.2
texture 11.1
border 10.9
city 10.8
navigate 10.8
guide 10.8
tourism 10.7
states 10.7
ancient 10.4
planet 10.4
floral 10.2
economy 10.2
clip art 10.2
template 10.1
ornate 10.1
road 10
geographic 9.9
expedition 9.9
boundary 9.9
find 9.8
sign 9.8
position 9.8
scroll 9.5
tile 9.5
nation 9.5
traditional 9.2
card 9
bank 9
style 8.9
decor 8.9
dollars 8.7
holiday 8.6
grungy 8.6
coffee 8.4
backdrop 8.3
earth 8.2
animal 7.9
doodle 7.9
explore 7.8
stylized 7.7
menu 7.6
page 7.4
envelope 7.3
cash 7.3
painting 7.2
history 7.2
market 7.1
modern 7
leaf 7

Google
created on 2019-07-23

Art 90.8
Tapestry 80.8
Textile 79.4
Illustration 78
Painting 75
History 64.5
Drawing 57.6

Microsoft
created on 2019-07-23

text 99.3
cartoon 99.2
drawing 99
book 98.5
sketch 95.4
illustration 89.4
person 83.9
clothing 78.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Male, 55%
Happy 52.1%
Calm 46.9%
Surprised 45.1%
Sad 45.5%
Angry 45.2%
Disgusted 45.1%
Confused 45.2%

AWS Rekognition

Age 20-38
Gender Male, 50.7%
Confused 45.5%
Disgusted 45.3%
Angry 45.6%
Calm 51.5%
Sad 46.5%
Surprised 45.5%
Happy 45.2%

AWS Rekognition

Age 26-43
Gender Male, 53.9%
Angry 45.9%
Disgusted 45.6%
Sad 47.7%
Happy 45.5%
Confused 45.4%
Surprised 45.9%
Calm 48.9%

AWS Rekognition

Age 20-38
Gender Male, 55%
Sad 46.7%
Confused 49.2%
Disgusted 45.4%
Angry 45.8%
Surprised 45.5%
Happy 45.6%
Calm 46.7%

AWS Rekognition

Age 20-38
Gender Male, 53.1%
Sad 45.6%
Happy 45.2%
Angry 45.4%
Surprised 45.4%
Confused 45.2%
Disgusted 45.5%
Calm 52.7%

AWS Rekognition

Age 26-43
Gender Male, 54.9%
Sad 46.4%
Happy 45.1%
Surprised 45.4%
Disgusted 45.1%
Calm 50.5%
Angry 47.1%
Confused 45.4%

AWS Rekognition

Age 26-43
Gender Male, 52.1%
Confused 45%
Sad 45.1%
Disgusted 45%
Calm 45%
Surprised 45.1%
Angry 45.1%
Happy 54.6%

AWS Rekognition

Age 29-45
Gender Male, 55%
Surprised 45.2%
Confused 45.4%
Sad 45.7%
Happy 45.1%
Disgusted 45.1%
Calm 53.4%
Angry 45.3%

AWS Rekognition

Age 20-38
Gender Male, 55%
Surprised 45.1%
Angry 45.1%
Confused 45.2%
Disgusted 45.1%
Calm 54.3%
Sad 45.1%
Happy 45.1%

Feature analysis

Amazon

Person 84.5%

Categories

Imagga

paintings art 99.5%

Captions

Microsoft
created on 2019-07-23

a close up of a book 47.3%
close up of a book 42.1%
a hand holding a book 38.6%

Text analysis

Amazon

i