Human Generated Data

Title

Michel Amelot

Date

17th century

People

Artist: Jean Frosne, French, ca. 1623 - after 1676

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R7675

Machine Generated Data

Tags

Amazon
created on 2019-07-30

Art 97.9
Painting 97.9
Human 97.3
Person 97.3
Drawing 59.4
Portrait 56.6
Face 56.6
Photography 56.6
Photo 56.6
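
The label/score pairs above match the output format of Amazon Rekognition's DetectLabels operation. The sketch below shows how tags of this form could be generated with boto3; the image filename is hypothetical, AWS credentials are assumed to be configured, and this is an illustration rather than the pipeline actually used for this record.

import boto3

# Minimal sketch (assumed setup, not the museum's pipeline): request labels
# for a local image file. "michel_amelot.jpg" is a hypothetical filename.
client = boto3.client("rekognition")
with open("michel_amelot.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,
    MinConfidence=50,
)
for label in response["Labels"]:
    # Prints e.g. "Art 97.9", matching the label/score format above.
    print(f"{label['Name']} {label['Confidence']:.1f}")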

Clarifai
created on 2019-07-30

people 99.7
one 99.1
portrait 96.5
adult 96.4
art 93.1
indoors 92.7
administration 92.4
print 91.9
engraving 91.6
pensive 91.1
leader 91.1
sit 90.9
man 90.6
wear 89.3
scholar 87.3
politician 85.5
woman 85
writer 79.3
antique 79
retro 77.5

Imagga
created on 2019-07-30

portrait 36.9
fashion 30.1
model 29.5
person 25.9
stucco 24.3
adult 23.9
face 22
attractive 21.7
pretty 21
sexy 20.1
hair 19.8
brunette 18.3
people 17.8
dress 17.2
posing 16.9
one 16.4
lady 16.2
window 16.1
cute 15.1
smile 15
happy 14.4
human 14.2
style 13.4
old 13.2
black 13
sensual 12.7
sensuality 12.7
man 12.1
lips 12
culture 12
building 11.9
architecture 11.9
make 11.8
clothes 11.2
skin 11
elegance 10.9
look 10.5
ancient 10.4
lifestyle 10.1
head 10.1
sculpture 10
looking 9.6
world 9.6
hairstyle 9.5
wall 9.5
adolescent 9.4
male 9.3
art 9.2
makeup 9.1
modern 9.1
vintage 9.1
framework 9.1
currency 9
juvenile 8.9
lovely 8.9
interior 8.8
body 8.8
clothing 8.7
urban 8.7
smiling 8.7
eyes 8.6
golden 8.6
elegant 8.6
youth 8.5
money 8.5
gold 8.2
structure 8.2
religion 8.1
antique 8
device 8
blond 7.9
business 7.9
stone 7.6
city 7.5
room 7.5
close 7.4
brown 7.4
banking 7.4
pose 7.2
brass 7.2
financial 7.1
covering 7.1

Google
created on 2019-07-30

Microsoft
created on 2019-07-30

drawing 98.7
sketch 98.3
art 97.4
person 94.4
human face 93.9
clothing 92.2
text 91.5
window 90.4
painting 87
white 76.8
black 75.8
portrait 72.6
museum 58.6
old 52.8
posing 36
picture frame 15.5

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 98.9%
Surprised 1.1%
Angry 1.7%
Calm 10.4%
Confused 3.2%
Happy 0.2%
Sad 81.3%
Disgusted 2.1%
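
The age range, gender, and per-emotion confidences above have the shape of Amazon Rekognition's DetectFaces output with full attributes requested. A minimal sketch using boto3 follows; the filename is hypothetical and this is not necessarily how the record was produced.

import boto3

# Minimal sketch (assumed setup): request full face attributes for a local
# image. "michel_amelot.jpg" is a hypothetical filename.
client = boto3.client("rekognition")
with open("michel_amelot.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types include CALM, SAD, ANGRY, etc., each with a confidence.
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")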

Microsoft Cognitive Services

Age 32
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 97.9%
Person 97.3%

Text analysis

Amazon

Frofne
1655
Feulpfit
I Frofne Feulpfit
AS
AS Ainl
Ainl
STETERE
I
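
These strings have the shape of output from Amazon Rekognition's DetectText operation; readings such as "I Frofne Feulpfit" likely reflect the engraving's long-s letterforms being read as "f" (i.e. "I. Frosne sculpsit"). A minimal sketch of how such detections could be produced with boto3 follows; the filename is hypothetical and this is not necessarily the pipeline used here.

import boto3

# Minimal sketch (assumed setup): run text detection over a local image and
# print the detected lines. "michel_amelot.jpg" is a hypothetical filename.
client = boto3.client("rekognition")
with open("michel_amelot.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])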

Google

1646
1646