Human Generated Data

Title

The Lamp

Date

August 31, 1890

People

Artist: Georges Lemmen, Belgian, 1865-1916

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of William S. Lieberman, by exchange, 2007.138

Machine Generated Data

Tags

Amazon
created on 2020-04-29

Art 96.8
Painting 96.8
Human 90.9
Drawing 90.3
Person 81.2
Sketch 74.1
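
The label/confidence pairs above are typical of AWS Rekognition's DetectLabels output. A minimal sketch of how such tags could be requested with boto3 follows; the bucket name, object key, and region are placeholders, not the museum's actual setup.

```python
# Sketch: fetching image labels with AWS Rekognition's DetectLabels API.
# The bucket, object key, and region below are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-images", "Name": "lemmen-the-lamp.jpg"}},
    MaxLabels=10,
    MinConfidence=70,
)

for label in response["Labels"]:
    # Each label carries a name and a 0-100 confidence score,
    # e.g. "Art 96.8", "Drawing 90.3" as listed above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```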

Clarifai
created on 2020-04-29

art 98.9
illustration 98.1
people 97.6
vintage 97.6
group 96.9
old 95.3
retro 93.8
man 92
painting 91.9
desktop 91.7
ancient 90.4
print 90.4
mammal 89.6
paper 89.3
fish 88.2
cat 88
antique 87.7
cavalry 86.6
chalk out 86.6
adult 86.3
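
A sketch of how concept tags like these could be requested from Clarifai's v2 predict endpoint follows; the API key, model ID, and image URL are placeholders, and the exact model used to generate the list above is not stated on this page.

```python
# Sketch: requesting concept tags from Clarifai's v2 predict endpoint.
# The API key, model ID, and image URL are placeholder assumptions.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"          # hypothetical credential
MODEL_ID = "general-image-recognition"     # assumed public general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/lemmen-the-lamp.jpg"}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Concepts come back with a 0-1 score; scale by 100 to match the list above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```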

Imagga
created on 2020-04-29

sketch 57.9
drawing 45.4
representation 38.1
currency 34.1
money 34
paper 32.2
dollar 31.6
cash 30.2
finance 28.7
vintage 25.9
dollars 25.1
wealth 24.3
bank 23.3
bill 22.8
business 22.5
antique 22.1
banking 22.1
hundred 21.3
rich 20.5
us 19.3
loan 19.2
retro 18.9
exchange 18.1
art 18
grunge 17.9
savings 17.7
franklin 16.7
old 16.7
bills 16.5
financial 16
ancient 14.7
design 14.6
finances 14.5
sign 14.3
one 13.4
pay 13.4
pattern 12.3
frame 11.9
stamp 11.9
decoration 11.8
market 11.5
close 11.4
investment 11
history 10.7
gold 10.7
treasury 10.5
texture 10.4
economy 10.2
postmark 9.9
states 9.7
mail 9.6
greenback 8.9
postage 8.9
postal 8.8
wages 8.8
funds 8.8
post 8.6
number 8.4
note 8.3
painting 8.1
symbol 8.1
object 8.1
decor 8
architecture 7.9
paying 7.8
floral 7.7
capital 7.6
decorative 7.5
traditional 7.5
element 7.4
style 7.4
letter 7.3
panel 7.3
heraldry 7.3
card 7.3
success 7.2
collection 7.2
creation 7.1
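
Imagga exposes tagging through a REST endpoint with HTTP Basic authentication. A minimal sketch follows; the API key, secret, and image URL are placeholders.

```python
# Sketch: tagging an image with Imagga's /v2/tags endpoint (HTTP Basic auth).
# API key/secret and the image URL are placeholders.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/lemmen-the-lamp.jpg"},
    auth=("api_key", "api_secret"),
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    # Imagga returns a 0-100 confidence and the tag text keyed by language.
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```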

Google
created on 2020-04-29

Microsoft
created on 2020-04-29

text 99.9
drawing 99.5
book 99.2
sketch 99.1
illustration 94.4
painting 90
cartoon 88.6
art 87.3
child art 73.2
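
Tags like these can be requested from Azure Computer Vision's Analyze Image operation. A sketch assuming the v3.1 REST endpoint follows; the resource endpoint, subscription key, and image URL are placeholders.

```python
# Sketch: requesting tags from Azure Computer Vision's Analyze Image endpoint.
# The endpoint, subscription key, and image URL are placeholders.
import requests

ENDPOINT = "https://example.cognitiveservices.azure.com"
KEY = "YOUR_SUBSCRIPTION_KEY"

response = requests.post(
    f"{ENDPOINT}/vision/v3.1/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/lemmen-the-lamp.jpg"},
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Tag confidences are 0-1; scale to match the percentages shown above.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```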

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 21-33
Gender Female, 93.2%
Surprised 0.3%
Happy 2.5%
Confused 0.3%
Calm 62.7%
Disgusted 0.2%
Sad 31%
Fear 1.3%
Angry 1.7%
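
Attributes such as the estimated age range, gender, and per-emotion scores above correspond to Rekognition's DetectFaces output when the full attribute set is requested. A minimal boto3 sketch follows; the bucket and key are placeholders.

```python
# Sketch: face attributes (age range, gender, emotions) via Rekognition DetectFaces.
# The bucket and object key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-images", "Name": "lemmen-the-lamp.jpg"}},
    Attributes=["ALL"],  # request the full attribute set, not just the default landmarks
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Emotions are returned with a confidence per type (Calm, Sad, Happy, ...).
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```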

Feature analysis

Amazon

Painting 96.8%
Person 81.2%

Categories

Captions

Microsoft
created on 2020-04-29

a close up of a book 53.7%
close up of a book 47.8%
a close up of a book cover 47.7%
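
Scored caption candidates like these come from Azure Computer Vision's Describe Image operation. A sketch assuming the v3.1 REST endpoint follows; the endpoint, key, and image URL are placeholders.

```python
# Sketch: image captions from Azure Computer Vision's Describe Image endpoint.
# The endpoint, key, and image URL are placeholders.
import requests

ENDPOINT = "https://example.cognitiveservices.azure.com"
KEY = "YOUR_SUBSCRIPTION_KEY"

response = requests.post(
    f"{ENDPOINT}/vision/v3.1/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/lemmen-the-lamp.jpg"},
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    # Each caption candidate carries a 0-1 confidence score.
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```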

Text analysis

Amazon

-3ia,9r
Hd
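
The fragments above are raw OCR output from the drawing's inscription. A sketch of how such text could be read with Rekognition's DetectText API follows; the bucket and key are placeholders.

```python
# Sketch: reading text in the image with AWS Rekognition's DetectText API.
# The bucket and object key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-images", "Name": "lemmen-the-lamp.jpg"}}
)

for detection in response["TextDetections"]:
    # LINE entries group words; WORD entries are individual tokens.
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'({detection["Confidence"]:.1f}%)')
```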

Google

31 aaiir, go. 604P
31
aaiir,
go.
604P
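
A sketch of comparable OCR with the Google Cloud Vision client library follows; the image URL is a placeholder and credentials are assumed to be configured in the environment.

```python
# Sketch: OCR with the Google Cloud Vision API (google-cloud-vision client).
# The image URL is a placeholder; credentials come from the environment.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/lemmen-the-lamp.jpg"

response = client.text_detection(image=image)

# The first annotation holds the full detected text block;
# the remaining annotations are individual words with bounding polygons.
if response.text_annotations:
    print(response.text_annotations[0].description)
```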