Human Generated Data

Title

Optics

Date

1690

People

Artist: Louis Simonneau, French, 1654/56 - 1727

Artist after: Michel II Corneille, French, 1642 - 1708

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R1135NA

Machine Generated Data

Tags

Amazon
created on 2019-11-09

Human 98.9
Person 98.9
Art 98
Person 97.1
Person 95.2
Person 89.3
Person 84.4
Painting 75.2
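
The Amazon tags above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch in Python using boto3 follows; the filename, region, and thresholds are illustrative assumptions, not values recorded with this object.

    # Sketch: produce label/confidence pairs like the Amazon list above with
    # Amazon Rekognition DetectLabels. Assumes boto3 is configured with AWS
    # credentials; "optics.jpg" is a hypothetical local copy of the image.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("optics.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=50,
        )

    for label in response["Labels"]:
        # Prints e.g. "Person 98.9" / "Painting 75.2", matching the format above.
        print(f"{label['Name']} {label['Confidence']:.1f}")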

Clarifai
created on 2019-11-09

people 99.9
art 99.4
adult 98.8
illustration 98.2
one 97.6
print 97.1
man 95.7
engraving 95.5
group 95.4
painting 95.1
baby 94.2
religion 91.2
woman 91.1
portrait 91.1
Renaissance 90.5
book 89.9
saint 89.3
two 88.8
veil 86.1
reclining 85.8
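
Clarifai concept tags like the list above are commonly retrieved from its v2 predict REST endpoint. The sketch below is a hedged illustration only: the endpoint path, request shape, and response layout follow Clarifai's public v2 docs as an assumption, and the API key, model identifier, and image URL are placeholders.

    # Sketch: fetch concept tags from the Clarifai v2 predict endpoint.
    # Assumptions: v2 REST request/response shape; API_KEY, MODEL_ID, and
    # IMAGE_URL are placeholders, not values recorded with this object.
    import requests

    API_KEY = "your_clarifai_api_key"
    MODEL_ID = "general-image-recognition"  # placeholder model identifier
    IMAGE_URL = "https://example.org/optics.jpg"  # hypothetical image URL

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    resp.raise_for_status()

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Prints e.g. "people 99.9", matching the format above.
        print(f"{concept['name']} {concept['value'] * 100:.1f}")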

Imagga
created on 2019-11-09

sundial 50.1
money 43.4
currency 41.3
timepiece 39.3
cash 35.7
dollar 34.3
finance 32.1
structure 30.5
measuring instrument 29.6
paper 29
bank 28.7
bill 27.6
wealth 26
business 25.5
financial 25
banking 24.8
brass 24.3
fountain 23.4
dollars 22.2
hundred 20.3
art 19.9
memorial 19.6
savings 19.6
instrument 19.4
us 18.3
close 18.3
device 17.9
one 17.2
rich 15.8
franklin 15.8
sketch 15.4
loan 15.3
tray 14.8
bills 14.6
banknote 14.6
pay 14.4
note 13.8
finances 13.5
exchange 13.4
concepts 13.3
mosaic 13.1
investment 12.8
banknotes 12.7
drawing 12.3
closeup 12.1
receptacle 12
pattern 11.6
artistic 11.3
success 11.3
old 11.1
grunge 11.1
graphic 10.9
design 10.7
texture 10.4
economy 10.2
color 10
container 9.8
retro 9.8
debt 9.6
black 9.6
profit 9.6
ingot 9.4
representation 9.4
generated 9.3
vintage 9.1
transducer 9.1
sign 9
digital 8.9
detail 8.9
states 8.7
light 8.7
antique 8.7
payment 8.7
shape 8.6
line 8.6
motion 8.6
wallpaper 8.4
sale 8.3
fractal 8.3
element 8.3
style 8.2
futuristic 8.1
funds 7.8
wages 7.8
space 7.8
notes 7.7
capital 7.6
ideas 7.5
commerce 7.5
number 7.5
symbol 7.4
backgrounds 7.3
effect 7.3
block 7.3
fantasy 7.2
soft 7.2
decoration 7.1
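
The Imagga tags above (largely mistaken readings such as "sundial" and "money" for an engraving) come from a general-purpose tagging endpoint. The sketch below is a hedged illustration of Imagga's v2 tagging REST API; the endpoint path, Basic-auth scheme, and response layout are assumptions drawn from its public docs, and the key, secret, and image URL are placeholders.

    # Sketch: fetch tags for an image URL from the Imagga v2 tagging API.
    # Assumptions: /v2/tags endpoint with HTTP Basic auth; API_KEY, API_SECRET,
    # and IMAGE_URL are placeholders, not values recorded with this object.
    import requests

    API_KEY = "your_imagga_api_key"
    API_SECRET = "your_imagga_api_secret"
    IMAGE_URL = "https://example.org/optics.jpg"  # hypothetical image URL

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    for item in resp.json()["result"]["tags"]:
        # Prints e.g. "sundial 50.1", matching the format above.
        print(f"{item['tag']['en']} {item['confidence']:.1f}")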

Google
created on 2019-11-09

Microsoft
created on 2019-11-09

text 100
book 100
drawing 95.8
person 94.3
painting 93.2
sketch 88.8
cartoon 74.6
clothing 71.7

Color Analysis

Face analysis

AWS Rekognition

Age 21-33
Gender Female, 54.7%
Angry 45%
Surprised 45%
Disgusted 45%
Confused 45%
Calm 49.1%
Sad 45%
Fear 45%
Happy 50.9%

AWS Rekognition

Age 33-49
Gender Female, 51.1%
Disgusted 45.1%
Confused 45%
Fear 45%
Sad 45.1%
Calm 53.8%
Happy 45.7%
Surprised 45.1%
Angry 45.2%

AWS Rekognition

Age 39-57
Gender Male, 54.4%
Angry 45.3%
Surprised 45.1%
Happy 45.1%
Calm 53.6%
Sad 45.1%
Disgusted 45.1%
Confused 45%
Fear 45.8%

AWS Rekognition

Age 11-21
Gender Male, 51.7%
Disgusted 45%
Surprised 45.2%
Happy 52.2%
Fear 45%
Confused 45%
Angry 45%
Calm 47.5%
Sad 45.1%

AWS Rekognition

Age 23-35
Gender Male, 51.7%
Disgusted 47.1%
Angry 51.6%
Confused 45.1%
Surprised 45%
Fear 45.7%
Happy 45.1%
Calm 45.1%
Sad 45.3%
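
Each "AWS Rekognition" block above corresponds to one face returned by Rekognition's DetectFaces operation with the full attribute set requested. A minimal sketch follows; the filename and region are hypothetical placeholders.

    # Sketch: per-face age range, gender, and emotion scores like the blocks
    # above, via Amazon Rekognition DetectFaces. "optics.jpg" is a hypothetical
    # local copy of the image.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("optics.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            # Emotion types arrive uppercase (e.g. "HAPPY"); title-case them
            # to match the listing above.
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")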

Microsoft Cognitive Services

Age 24
Gender Female

Feature analysis

Amazon

Person 98.9%
Painting 75.2%

Categories

Imagga

paintings art 97.6%
pets animals 1.9%

Captions

Microsoft
created on 2019-11-09

an old photo of a book 48.6%
a close up of a book 47.4%
an old photo of a person 47.3%
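
Candidate captions like the three above are what Azure's Computer Vision "describe" operation returns when asked for multiple candidates. The sketch below runs against the REST API; the endpoint, API version, key, and image URL are placeholders, and the exact version used to generate the 2019 data may differ.

    # Sketch: request several candidate captions from Azure Computer Vision.
    # Assumptions: v3.2 "describe" REST endpoint; ENDPOINT, KEY, and IMAGE_URL
    # are placeholders, not values recorded with this object.
    import requests

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
    KEY = "your_subscription_key"
    IMAGE_URL = "https://example.org/optics.jpg"  # hypothetical image URL

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": 3},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    resp.raise_for_status()

    for caption in resp.json()["description"]["captions"]:
        # Prints e.g. "an old photo of a book 48.6", matching the list above.
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}")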

Text analysis

Amazon

LOPTTIQUE
N
Suonneausa
N.Comeitlespinrit:
Suonneausa solprit:
solprit:
AFarthxc
SFeguts
AFarthxc Chrror SFeguts
Chrror
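
The Amazon text results above, including the misread engraved inscriptions, are in the raw output style of Rekognition's DetectText operation, which returns both whole lines and their individual words; that is why phrases and their fragments both appear. A minimal sketch, with a hypothetical filename:

    # Sketch: raw OCR detections like the Amazon list above, via Amazon
    # Rekognition DetectText. "optics.jpg" is a hypothetical local copy.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("optics.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        # Each detection is either a full LINE or a single WORD from that line,
        # which is why the list above repeats fragments of the same inscription.
        print(detection["Type"], detection["DetectedText"],
              f"{detection['Confidence']:.1f}")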

Google

PICART Sumonneau,Saulpit N.Corneillegpineit AParicchesCaz Cheraau rue L OPTIQUE cques au grand sRemi
Sumonneau,Saulpit
N.Corneillegpineit
Cheraau
rue
L
OPTIQUE
cques
grand
sRemi
PICART
AParicchesCaz
au
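
The Google results follow the same pattern via the Cloud Vision API's text detection: the first annotation is the full transcription and the remaining annotations are the individual words, which matches the list above. A minimal sketch using the google-cloud-vision client library, with a hypothetical filename:

    # Sketch: full-text plus per-word OCR like the Google list above, via the
    # Cloud Vision API. "optics.jpg" is a hypothetical local copy of the image.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("optics.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # The first annotation is the entire detected text; the rest are the
    # individual words, mirroring the structure of the list above.
    for annotation in response.text_annotations:
        print(annotation.description)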