Human Generated Data

Title

The Seer (Maboroshi), Illustration to Chapter 41 of the "Tale of Genji" (Genji monogatari)

Date

17th century

People

-

Classification

Paintings

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Bequest of Mrs. John T. Linzee, 1931.249

Machine Generated Data

Tags

Each tag below is paired with the generating service's confidence score on a 0-100 scale.

Amazon
created on 2020-04-24

Poster 98
Advertisement 98
Person 83.3
Human 83.3
Drawing 82.8
Art 82.8
Collage 73.7
Soil 67.6
Text 67.4
Plan 66.1
Plot 66.1
Diagram 66.1
People 61.5
Sketch 58.7
Archaeology 55.9
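
These labels have the shape of AWS Rekognition's DetectLabels output. A minimal sketch of how such tags could be produced with boto3; the image file name and the MinConfidence threshold are illustrative assumptions, not part of the record:

```python
import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the image; any JPEG/PNG bytes work here.
with open("genji_chapter41.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed cutoff; the lowest score above is 55.9
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```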

Clarifai
created on 2020-04-24

people 99.9
print 99.2
group 98.7
adult 98.3
illustration 97.9
war 97.3
vehicle 96.8
one 96.3
two 96.1
soldier 95.9
many 95.5
man 95.4
military 95.3
engraving 95
wear 94.6
skirmish 94.2
art 92.9
transportation system 90.5
painting 89.8
group together 89.6
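
Concepts like these come from Clarifai's general prediction model. A hedged sketch of the v2 REST call; the API key, model alias, and image URL are placeholders, and the exact model identifier may differ by account:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {
    "inputs": [
        {"data": {"image": {"url": "https://example.org/genji_chapter41.jpg"}}}
    ]
}
resp = requests.post(url, json=payload, headers={"Authorization": f"Key {API_KEY}"})

# Concept values are 0-1; scale to match the percentages listed above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```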

Imagga
created on 2020-04-24

book jacket 25.9
money 25.5
cash 24.7
finance 23.6
currency 23.3
equipment 22.8
bank 22.4
business 21.2
covering 21.1
box 20.6
wealth 20.6
jacket 20.1
paper 19.6
lock 19.5
savings 18.6
blister pack 17
container 16.8
security 16.5
wrapping 16.3
old 16
packaging 15.3
open 15.3
binder 14.8
banking 14.7
metal 14.5
protective covering 13.3
financial 13.3
dollar 13
safe 12.9
device 12.7
note 11.9
briefcase 11.8
dollars 11.6
secure 11.6
storage 11.4
office 11.4
technology 11.1
investment 11
carpenter's kit 10.9
symbol 10.8
vintage 10.7
bills 10.7
home 10.4
close 10.3
object 10.3
safety 10.1
hundred 9.7
kit 9.6
exchange 9.5
bill 9.5
save 9.5
rich 9.3
3d 9.3
protection 9.1
sign 9
retro 9
computer 8.9
steel 8.8
deposit 8.8
billboard 8.7
treasure 8.7
empty 8.6
design 8.5
black 8.4
key 8.4
stack 8.3
letter 8.2
gold 8.2
data 8.2
case 8.1
information 8
vault 7.9
book 7.7
payment 7.7
door 7.6
envelope 7.6
structure 7.4
signboard 7
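
Imagga exposes tagging through a REST endpoint with HTTP basic auth. A sketch; the key/secret pair and image URL are placeholders:

```python
import requests

# Imagga authenticates with the API key as username, secret as password.
auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")  # placeholders

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/genji_chapter41.jpg"},
    auth=auth,
)

for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```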

Google
created on 2020-04-24

Vehicle 51.9
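
The single Google label corresponds to Cloud Vision label detection. A sketch with the google-cloud-vision client; the file name is an assumption:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("genji_chapter41.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # Vision API scores are 0-1; scale to match the listing above.
    print(f"{label.description} {label.score * 100:.1f}")
```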

Microsoft
created on 2020-04-24

text 98.9
drawing 98.5
sketch 93.7
black and white 88.1
painting 85.1
construction 66
old 52.2
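
These tags match the output of Azure's Computer Vision tagging operation. A sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),          # placeholder
)

result = client.tag_image("https://example.org/genji_chapter41.jpg")
for tag in result.tags:
    # SDK confidences are 0-1; scale to match the listing above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```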

Face analysis

Amazon

AWS Rekognition

Age 30-46
Gender Female, 53.4%
Happy 50.1%
Calm 45.9%
Sad 46.9%
Disgusted 45.2%
Fear 45.6%
Surprised 45.3%
Angry 45.8%
Confused 45.2%
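
The age range, gender, and per-emotion confidences above follow the shape of Rekognition's DetectFaces response when all facial attributes are requested. A sketch; the file name is an assumption:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("genji_chapter41.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```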

Feature analysis

Amazon

Poster 98%
Person 83.3%

Captions

Microsoft
created on 2020-04-24

an old photo of a person 31.9%
old photo of a person 29%
an old photo of a building 28.9%
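
The candidate captions correspond to Azure Computer Vision's image-description operation. A sketch using the same client setup shown under the Microsoft tags; max_candidates is assumed from the three captions listed:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),          # placeholder
)

description = client.describe_image(
    "https://example.org/genji_chapter41.jpg",
    max_candidates=3,  # the record lists three candidate captions
)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```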

Text analysis

Amazon

1931.
2449
1931. 9 2449
9
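
These strings have the shape of Rekognition's DetectText output, which reports both line-level and word-level detections; that is why the accession number appears both whole and split into pieces. A sketch:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("genji_chapter41.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # "LINE" entries aggregate their constituent "WORD" entries.
    print(detection["Type"], detection["DetectedText"])
```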

Google

1931.249
1931.249
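
The repeated "1931.249" is consistent with Cloud Vision text detection, where the first annotation is the full detected text and later entries are its individual words. A sketch:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("genji_chapter41.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)
```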