Human Generated Data

Title

View of the Interior of the Gankirō Tea House in Yokohama (Yokohama Gankirō no zu), published by Daikokuya Kinnosuke

Date

Late Edo period, fourth month of 1860

People

Artist: Utagawa Hiroshige II, Japanese, 1826 - 1869

Classification

Prints

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of William S. Lieberman, 2007.214.11.3

Machine Generated Data

Tags

Amazon
created on 2019-11-05

Human 97.6
Person 97.6
Art 95.9
Person 95.5
Person 93.3
Person 91.1
Person 90.8
Painting 84.9
Person 81.1
Person 80.6
Drawing 75.2
Mural 73
Poster 58.7
Advertisement 58.7
Sketch 55
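
The Amazon tags above are label-detection results, with confidences expressed as percentages. A minimal sketch of how comparable labels could be requested with the boto3 Rekognition client; the filename print.jpg and locally configured AWS credentials are assumptions for illustration, not part of the museum's actual pipeline:

import boto3

# Hypothetical local copy of the print; any JPEG/PNG bytes would work.
with open("print.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# Label detection returns names with confidence percentages,
# comparable to the "Human 97.6", "Art 95.9", ... values listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,
)
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')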

Clarifai
created on 2019-11-05

people 98.5
no person 98.2
painting 97.3
art 94.8
lithograph 94.5
print 93.1
adult 93
furniture 92.7
group 92.2
illustration 91.8
travel 91.8
seat 91.6
architecture 90.3
indoors 88.3
one 87.9
home 87.7
room 86.9
house 86.4
outdoors 85.6
building 84.8

Imagga
created on 2019-11-05

shop 44.3
mercantile establishment 33.5
architecture 30.1
building 29
old 27.2
graffito 25.3
place of business 22.2
decoration 22
city 20
wall 20
window 18.2
door 17.4
travel 16.2
history 15.2
barbershop 15.1
house 15
street 14.7
ancient 14.7
bookshop 13.5
religion 13.4
vintage 13.2
tourism 13.2
structure 12.4
antique 12.1
town 12.1
church 12
urban 11.4
art 11.1
exterior 11.1
establishment 11
glass 10.9
tobacco shop 10.5
culture 10.3
cathedral 10.1
entrance 9.7
paper 9.4
stone 9.3
historic 9.2
facade 9.1
retro 9
style 8.9
brown 8.8
business 8.5
shoe shop 8.4
aged 8.1
home 8
yellow 8
balcony 7.9
wooden 7.9
design 7.9
historical 7.5
place 7.4
shelf 7.4
detail 7.2
color 7.2
landmark 7.2
bank 7.2

Google
created on 2019-11-05

Illustration 84.5
Painting 81.3
Art 76.7
House 70.2
Room 65.7
Architecture 65.5
Wood 65.3
Facade 63.7
Window 61
Door 59.3
Artwork 56
Building 55.4
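
The Google values are consistent with Cloud Vision label detection, which reports scores in the 0-1 range; the figures above appear rescaled to percentages. A sketch using the google-cloud-vision client library, again assuming a hypothetical local copy of the image and configured credentials:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the print.
with open("print.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# Label detection; each annotation carries a description and a 0-1 score.
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")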

Microsoft
created on 2019-11-05

text 92.8
cartoon 91.6
christmas tree 70.1

Face analysis

Amazon

AWS Rekognition

Age 20-32
Gender Female, 50.8%
Angry 45%
Happy 45%
Surprised 45.1%
Sad 45%
Calm 54.9%
Confused 45%
Fear 45%
Disgusted 45%

AWS Rekognition

Age 13-23
Gender Female, 50.1%
Fear 49.6%
Happy 49.5%
Angry 49.9%
Confused 49.6%
Surprised 49.5%
Disgusted 49.5%
Sad 49.9%
Calm 49.5%

AWS Rekognition

Age 3-9
Gender Male, 50.4%
Disgusted 49.5%
Fear 49.9%
Happy 49.5%
Calm 49.5%
Surprised 49.5%
Angry 49.5%
Confused 49.8%
Sad 49.7%

AWS Rekognition

Age 6-16
Gender Female, 52.5%
Angry 45.2%
Disgusted 45%
Happy 45.3%
Sad 47.2%
Calm 52%
Fear 45.2%
Confused 45%
Surprised 45.1%

AWS Rekognition

Age 32-48
Gender Male, 50.5%
Disgusted 49.5%
Confused 49.5%
Calm 49.6%
Fear 49.6%
Happy 49.5%
Angry 49.6%
Sad 50.1%
Surprised 49.5%

AWS Rekognition

Age 23-35
Gender Female, 50.3%
Confused 49.5%
Fear 49.5%
Angry 49.5%
Calm 49.5%
Surprised 49.5%
Sad 50.4%
Happy 49.5%
Disgusted 49.6%
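
Each block above corresponds to one face returned by Rekognition face detection with full attributes (age range, gender, and per-emotion confidences). A sketch of the equivalent call, under the same assumptions as the label-detection example (print.jpg is a hypothetical local copy):

import boto3

rekognition = boto3.client("rekognition")

with open("print.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] returns AgeRange, Gender, and Emotions with confidences,
# the same fields reported for each detected face above.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')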

Feature analysis

Amazon

Person 97.6%

Captions

Microsoft
created on 2019-11-05

a graffiti covered building 48.1%
graffiti on a wall 48%
a graffiti covered wall 47.9%
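
These captions resemble the output of the Azure Computer Vision describe operation, which returns ranked caption candidates with confidences. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, subscription key, and image URL below are placeholders, not the service configuration actually used:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key, for illustration only.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

# describe_image returns caption candidates with 0-1 confidences,
# comparable to the three captions listed above.
description = client.describe_image("https://example.com/print.jpg", max_candidates=3)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")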

Text analysis

Amazon

NRN
LEI
1I
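
The detected strings above are Rekognition text-detection output; the Japanese inscriptions on the print are evidently misread as short Latin fragments. A sketch of the call, under the same assumptions as the earlier Rekognition examples:

import boto3

rekognition = boto3.client("rekognition")

with open("print.jpg", "rb") as f:  # hypothetical local copy of the print
    image_bytes = f.read()

# Text detection returns WORD and LINE detections with the recognized string.
response = rekognition.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])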