Human Generated Data

Title

View of the Interior of the Gankirō Tea House in Yokohama (Yokohama Gankirō no zu), published by Daikokuya Kinnosuke

Date

Late Edo period, fourth month of 1860

People

Artist: Utagawa Hiroshige II, Japanese, 1826–1869

Classification

Prints

Machine Generated Data

Tags

Amazon

Human 99
Person 99
Person 98.8
Person 97.8
Art 95
Painting 91.9
Person 90.5
Person 87.4
Mural 78.1
Home Decor 77.4
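The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. A minimal sketch of reducing such a response to the "Name confidence" lines shown, using a hard-coded illustrative response (not the actual API output for this print) rather than a live API call:

```python
# Sketch: flatten a DetectLabels-style response into "Name confidence" lines.
# `sample_response` is an illustrative subset, hand-copied from the tags above.
sample_response = {
    "Labels": [
        {"Name": "Human", "Confidence": 99.0},
        {"Name": "Person", "Confidence": 99.0},
        {"Name": "Art", "Confidence": 95.0},
        {"Name": "Painting", "Confidence": 91.9},
    ]
}

def format_labels(response, threshold=55.0):
    """Return labels at or above `threshold`, sorted by descending confidence."""
    labels = [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= threshold
    ]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    return [f"{name} {round(conf, 1):g}" for name, conf in labels]

for line in format_labels(sample_response):
    print(line)  # e.g. "Human 99", then "Painting 91.9" last
```

In a live setting the response would come from `boto3.client("rekognition").detect_labels(...)`; the `threshold` parameter here is an assumed cutoff, not one documented by the data above.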

Clarifai

people 98.9
painting 96
art 95.2
building 92.7
adult 92.5
architecture 91.1
travel 90.9
group 90.4
religion 89
daylight 87.6
illustration 87.4
house 86.9
woman 86.5
city 86.1
print 85.9
man 85.3
no person 85
landscape 84.2
tourism 84.1
street 83.3

Imagga

city 39.9
architecture 33.8
building 32.6
street 27.6
old 23.7
shop 21.7
town 20.4
urban 20.1
travel 18.3
house 16.7
wall 16.6
structure 14.9
sky 14.7
mercantile establishment 14.5
window 14.2
brick 14.2
buildings 14.2
cityscape 13.2
history 12.5
ancient 12.1
landmark 11.7
houses 11.6
tourism 11.5
stall 11.2
church 11.1
door 11
glass 10.9
residential 10.5
attraction 10.5
style 10.4
historical 10.3
tourist 10.3
light 10
device 10
bookshop 10
tower 9.8
modern 9.8
place of business 9.6
decoration 9.6
roof 9.5
skyline 9.5
famous 9.3
historic 9.2
business 9.1
religion 9
narrow 8.9
cathedral 8.8
home 8.8
high 8.7
architectural 8.6
downtown 8.6
colorful 8.6
construction 8.6
industry 8.5
stone 8.5
skyscraper 8.3
room 8.2
metal 8
detail 8
night 8
yellow 7.9
scene 7.8
bridge 7.6
tile 7.5
exterior 7.4
design 7.3
bedroom 7.1

Google

Illustration 82.8
Art 79.6
Painting 75.3
Room 65.7
Architecture 65.5
Fiction 51.4
Artwork 51.1

Microsoft

Face analysis

Amazon

AWS Rekognition

Age 38-56
Gender Male, 50.2%
Fear 45%
Happy 45.3%
Sad 45.2%
Disgusted 45%
Calm 54.4%
Surprised 45%
Confused 45%
Angry 45%

AWS Rekognition

Age 49-67
Gender Male, 50.7%
Happy 46.2%
Calm 45.7%
Confused 45.1%
Disgusted 45.1%
Surprised 45%
Angry 46.8%
Sad 50.4%
Fear 45.7%

AWS Rekognition

Age 19-31
Gender Female, 50.4%
Calm 50.2%
Happy 49.5%
Fear 49.5%
Angry 49.8%
Surprised 49.5%
Disgusted 49.5%
Sad 49.5%
Confused 49.5%

AWS Rekognition

Age 31-47
Gender Male, 53.1%
Fear 45%
Sad 45%
Disgusted 45%
Happy 45.1%
Surprised 45%
Calm 54.8%
Confused 45%
Angry 45.1%

AWS Rekognition

Age 8-18
Gender Female, 77%
Fear 28%
Calm 5.6%
Disgusted 0.6%
Confused 3.4%
Sad 42.8%
Surprised 5.1%
Angry 11.3%
Happy 3.2%
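Each AWS Rekognition block above reports an emotion-confidence distribution for one detected face. A small sketch of reducing such a distribution to a dominant emotion, using values hand-copied from the last face above (the mapping itself is an assumption about how one might summarize this data, not part of the Rekognition output):

```python
# Emotion confidences for the fifth detected face, copied from the data above.
emotions = {
    "Fear": 28.0, "Calm": 5.6, "Disgusted": 0.6, "Confused": 3.4,
    "Sad": 42.8, "Surprised": 5.1, "Angry": 11.3, "Happy": 3.2,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

name, conf = dominant_emotion(emotions)
print(f"{name} {conf}%")  # → Sad 42.8%
```

Note that the first four faces cluster near 45–55% across all emotions, so a dominant-emotion summary is only meaningful where one score clearly separates, as with this fifth face.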

Feature analysis

Amazon

Person 99%

Captions

Microsoft

a group of stuffed animals sitting on top of a wooden door 25.4%
a group of people standing in front of a building 25.3%
a group of people standing on top of a wooden door 25.2%

Text analysis

Amazon

194