Human Generated Data

Title

Scene from the play "Imoseyama"

Date

People

Artist: Utagawa Toyokuni 歌川豊国, Japanese, 1769 - 1825

Classification

Prints

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Gift of the Friends of Arthur B. Duel, 1933.4.753

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 94.8
Human 94.8
Apparel 85.8
Clothing 85.8
Person 83
Comics 82.2
Book 82.2
Person 79.5
Painting 73.8
Art 73.8
Text 56.9

Imagga
created on 2021-12-15

map 90.1
jigsaw puzzle 43.9
comic book 41.3
puzzle 40.5
representation 38.2
geography 31.8
game 29.1
atlas 25
world 24.8
travel 24.7
antique 24.5
old 23.7
globe 23.2
vintage 23.2
plan 19.9
country 19.8
capital 19
route 18.6
continent 18.5
city 18.3
navigation 18.3
direction 17.2
nation 17.1
art 16.9
print media 16.5
location 15.7
states 15.5
gold 14.8
global 14.6
road 14.5
symbol 14.2
paper 14.1
guide 13.7
grunge 13.6
planet 13.2
journey 13.2
navigate 12.8
discovery 12.7
sepia 12.6
state 12.5
tourism 12.4
drawing 12.1
cartography 11.8
color 11.7
tour 11.6
business 11.6
retro 11.5
mosaic 11.3
design 11.3
geographic 10.9
expedition 10.9
boundary 10.8
explore 10.8
find 10.8
position 10.8
wallpaper 10.7
money 10.2
dutch 9.7
north 9.6
graphic 9.5
earth 9.2
texture 9
decoration 9
currency 9
financial 8.9
ancient 8.7
finance 8.5
economy 8.4
pattern 8.2
religion 8.1
close 8
land 7.9
painted 7.6
destination 7.5
page 7.4
dollar 7.4
church 7.4
investment 7.3
border 7.2
painting 7.2
history 7.2
icon 7.1
idea 7.1

Google
created on 2021-12-15

Textile 87.6
Botany 87.3
Organism 86.6
Art 83.6
Painting 82.1
Plant 79.3
Adaptation 79.2
Tree 76.9
Illustration 73.5
Creative arts 73.4
Pattern 73.1
Poster 72.3
Drawing 71.8
Visual arts 69.8
Paper product 59.8
Font 57.1
Printmaking 53.8
Fictional character 53.1
Mythology 53
Paper 52.6

Microsoft
created on 2021-12-15

text 99.8
drawing 98.6
cartoon 97.4
painting 97.4
book 92.9
child art 91.3
clothing 75.6
person 75
poster 72.6
sketch 62.4
illustration 51.6

Face analysis

Amazon

AWS Rekognition

Age 13-25
Gender Male, 73.4%
Calm 86.7%
Happy 7.9%
Sad 2.4%
Surprised 1.3%
Confused 0.7%
Angry 0.5%
Disgusted 0.5%
Fear 0.1%

AWS Rekognition

Age 23-35
Gender Male, 78.7%
Calm 52.1%
Happy 21.5%
Surprised 13.5%
Angry 9.6%
Confused 1.1%
Disgusted 1%
Sad 0.9%
Fear 0.4%

AWS Rekognition

Age 19-31
Gender Male, 98.3%
Calm 72%
Happy 13.9%
Angry 5.2%
Surprised 3%
Confused 2.3%
Disgusted 1.7%
Sad 1.1%
Fear 0.8%

AWS Rekognition

Age 17-29
Gender Male, 76.6%
Calm 74.7%
Confused 16.6%
Surprised 6.2%
Sad 1.2%
Angry 0.5%
Happy 0.4%
Disgusted 0.3%
Fear 0.1%

AWS Rekognition

Age 16-28
Gender Female, 53.4%
Calm 81.9%
Surprised 5.6%
Confused 5.2%
Happy 3%
Angry 1.4%
Sad 1.3%
Fear 1%
Disgusted 0.4%

Feature analysis

Amazon

Person 94.8%
Painting 73.8%

Captions

Microsoft

a map of a book 32.4%
a close up of a map 32.3%
a book on top of a map 32.2%

Text analysis

Amazon

427

Google