Human Generated Data

Title

Temple Garden

Date

Early to Mid Edo period, circa late 17th to early 18th century

People

Artist: Hishikawa Moronobu, Japanese d. 1694

Classification

Prints

Machine Generated Data

Tags

Amazon

Plot 95.7
Diagram 95.5
Plan 95.5
Art 89.3
Painting 89.3
Human 83.6
Person 83.6
Person 78.9
Map 58.3

Clarifai

paper 98.7
painting 98.4
illustration 98.3
old 97.7
ancient 97.4
art 97.3
map 97
retro 96.5
antique 96.3
wall 96.1
text 95.8
architecture 95.4
manuscript 95.2
print 94.9
guidance 94.6
desktop 94.3
vintage 94.2
no person 93.6
dirty 92.2
picture frame 91.7

Imagga

map 100
representation 84.6
atlas 46.5
geography 45.3
antique 45.2
vintage 44.8
old 43.3
world 37.3
continent 32.1
navigation 31.8
globe 31.6
travel 30.3
plan 30.3
sepia 29.2
country 28.4
route 28.4
nation 27.5
location 27.5
discovery 27.3
navigate 26.5
road 26.3
direction 25.8
capital 25.7
planet 25.5
wallpaper 25.3
city 25
geographic 24.7
expedition 24.7
retro 24.6
journey 24.5
guide 24.5
paper 24.4
boundary 23.7
find 23.5
position 23.5
gold 23.1
tour 22.3
tourism 21.5
states 19.4
grunge 18.8
explore 18.6
sketch 17
envelope 16.6
state 16.3
drawing 16.1
ancient 15.6
path 15.2
graffito 15
art 15
texture 13.9
dutch 13.6
design 13
earth 12.8
global 12.8
aged 12.7
note 12
money 11.9
page 11.2
decoration 10.9
stamp 10.9
border 10.9
maps 10.9
countries 10.7
finance 10.2
frame 10
backdrop 9.9
pattern 9.6
grungy 9.5
business 9.1
history 9
card 8.8
torn 8.7
mail 8.6
international 8.6
floral 8.5
destination 8.4
letter 8.3
currency 8.1
financial 8
postmark 7.9
postage 7.9
cartography 7.9
structure 7.8
bookmark 7.8
north 7.7
canvas 7.6
decorative 7.5
style 7.4
symbol 7.4
cash 7.3
graphic 7.3
bank 7.2

Google

Text 85.2
Art 81.6
Drawing 79.8
Line 78.7
Sketch 70.9
Illustration 69.1
Visual arts 68.6
Painting 66.1
Artwork 62.2
House 59.3
History 54.1

Microsoft

drawing 99.2
sketch 94.8
text 93.1
map 90.1
child art 71
handwriting 67.9

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Male, 55%
Happy 45.1%
Surprised 45.4%
Calm 46.4%
Confused 46.6%
Sad 47.6%
Disgusted 48.2%
Angry 45.7%

AWS Rekognition

Age 26-43
Gender Female, 50.2%
Confused 49.5%
Angry 49.6%
Disgusted 49.6%
Sad 50%
Calm 49.6%
Happy 49.6%
Surprised 49.6%

AWS Rekognition

Age 26-43
Gender Male, 50.3%
Sad 49.7%
Calm 50.2%
Confused 49.5%
Happy 49.5%
Disgusted 49.5%
Angry 49.6%
Surprised 49.5%

AWS Rekognition

Age 10-15
Gender Female, 50.4%
Disgusted 49.5%
Happy 49.5%
Calm 49.6%
Sad 50.3%
Angry 49.6%
Confused 49.5%
Surprised 49.5%

Feature analysis

Amazon

Painting 89.3%
Person 83.6%

Captions

Microsoft

a close up of a map 77.5%
close up of a map 73.4%
a map with text 65.8%