Human Generated Data

Title

Woman Dancing

Date

17th century

People

Artist: Michel Lasne, French, before 1590 - 1667

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R4282

Machine Generated Data

Tags

Amazon
created on 2019-08-10

Person 96.3
Human 96.3
Art 88.5
Drawing 81.7
Leisure Activities 69.9
Sketch 63.1
Painting 60.7
Performer 60
Text 55.5

Clarifai
created on 2019-08-10

print 99.7
illustration 99.5
people 99.4
art 99.1
one 97.4
adult 97.2
engraving 96.5
man 95.6
vintage 95.4
antique 95.1
portrait 94.1
woman 93.6
retro 93.1
old 92.7
lithograph 92.6
painting 92.1
book bindings 89.5
wear 85.4
visuals 85
paper 83.1

Imagga
created on 2019-08-10

statue 45.3
book jacket 42.4
sculpture 37.1
jacket 33.1
art 32.7
religion 27.8
wrapping 25.1
architecture 19.6
god 19.1
church 18.5
covering 18.2
old 18.1
culture 17.9
monument 17.7
faith 16.3
dress 16.3
city 15.8
travel 15.5
religious 15
ancient 14.7
history 14.3
temple 14.2
historical 14.1
figure 13.7
face 13.5
building 13.5
spirituality 13.4
antique 12.8
catholic 12.6
holy 12.5
lady 12.2
famous 12.1
detail 12.1
cemetery 11.8
bust 11.6
tourism 11.5
brass 11.4
stone 11.2
golden 11.2
landmark 10.8
closeup 10.8
decoration 10.7
saint 10.6
person 10.1
empire 9.8
fashion 9.8
prayer 9.7
memorial 9.5
column 9.3
decorative 9.2
structure 9.2
historic 9.2
tourist 9.1
portrait 9.1
bible 8.8
man 8.7
pray 8.7
artistic 8.7
spiritual 8.6
cathedral 8.6
money 8.5
makeup 8.2
gold 8.2
museum 8.2
marble 8.1
currency 8.1
icon 7.9
oriental 7.8
traditional 7.5
vintage 7.4
style 7.4
people 7.2
color 7.2
crown 7.2
financial 7.1
romantic 7.1
interior 7.1

Google
created on 2019-08-10

Microsoft
created on 2019-08-10

text 100
book 99.8
drawing 97.7
sketch 97.6
woman 81.2
clothing 80.5
person 78.4
cartoon 76.9
painting 70.1
human face 60.7
dance 57.3

Color Analysis

Face analysis

AWS Rekognition

Age 24-38
Gender Female, 91.6%
Confused 3.5%
Disgusted 11.5%
Calm 9.8%
Angry 4.7%
Happy 0%
Fear 5.6%
Sad 63.9%
Surprised 0.9%

Microsoft Cognitive Services

Age 24
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.3%

Categories

Imagga

paintings art 93.7%
events parties 5.1%

Captions

Microsoft
created on 2019-08-10

a close up of a book 46.4%
close up of a book 40.9%
a hand holding a book 40.8%

Text analysis

Amazon

Pouroit
Charlot
nais
hors
moy
de
la
dis
peu
vng
lo
Cum
GTi dis vray Charlot nais la Donne de
Donne
Juis
d'aleye
gnuenit
paler
Pouroit bien paler lo playfur: I Car je moy Juis vng tantgIt peu hors loyfir
Mafue gnuenit et fecit Marierte eeud Cum Prtuilegio d'aleye
je
Prtuilegio
et fecit
Car
Marierte
eeud
vray
bien
GTi
loyfir
tantgIt
Mafue
playfur:
I

Google

Donne moy vng peu de loyfi Car je fuis tantoft hors d'aleyne TiL dis vray Charlot mais lapeyne Pouroit bien paler lo play d Cum Priuilegio MLafue muenie ct fecit T
Car
d'aleyne
TiL
dis
vray
mais
lapeyne
bien
paler
lo
play
d
Priuilegio
fecit
T
Donne
moy
vng
peu
de
loyfi
je
fuis
tantoft
hors
Charlot
Pouroit
Cum
MLafue
muenie
ct