Human Generated Data

Title

Tric-Trac Players

Date

17th century

People

Artist: Lucas Vorsterman, I, Flemish, 1595-1675

Artist after: Adam de Coster, Flemish, c. 1586-1643

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R4697

Machine Generated Data

Tags

Amazon
created on 2019-08-06

Musical Instrument 99.5
Lute 99.5
Human 99.2
Person 99.2
Person 99.2
Person 97.6
Art 96.8
Painting 96.8
Person 96.2

Clarifai
created on 2019-08-06

people 99.9
group 99.2
child 98.7
woman 97.4
adult 96.9
two 95.4
three 94.9
portrait 93.9
man 93.7
music 93.3
group together 92.8
four 92.3
wear 91.6
family 90.3
art 89.8
son 88.2
five 87.9
several 86.6
furniture 84.4
sit 83.5

Imagga
created on 2019-08-06

money 20.4
currency 19.7
cash 19.2
art 19.1
bank 17
dollar 16.7
banking 16.5
sculpture 16.2
financial 16
wealth 15.3
church 14.8
business 14.6
one 14.2
person 13.9
finance 13.5
old 13.2
savings 13
people 12.8
dollars 12.5
religion 12.5
man 12.1
close 12
paper 11.9
black 11.8
faith 11.5
god 11.5
male 11.4
bill 11.4
rich 11.2
portrait 11
statue 10.8
antique 10.4
culture 10.2
economy 10.2
model 10.1
face 9.9
history 9.8
adult 9.8
fashion 9.8
banknote 9.7
holy 9.6
pay 9.6
symbol 9.4
religious 9.4
investment 9.2
dress 9
closeup 8.8
catholic 8.7
hair 8.7
prayer 8.7
finances 8.7
saint 8.7
cathedral 8.6
spiritual 8.6
exchange 8.6
famous 8.4
carving 8.3
vintage 8.3
figure 8.1
sexy 8
world 8
bible 7.8
ancient 7.8
us 7.7
pretty 7.7
capital 7.6
head 7.6
historical 7.5
human 7.5
city 7.5
product 7.3
detail 7.2
success 7.2
night 7.1
market 7.1
covering 7

Google
created on 2019-08-06

Microsoft
created on 2019-08-06

text 99.6
drawing 98.5
sketch 97.2
person 93.8
clothing 93.1
cartoon 91.6
book 90.3
painting 87.3
man 74.1
illustration 62.2
human face 51.4
old 40.7

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 20-38
Gender Female, 91.5%
Calm 7.1%
Disgusted 2.3%
Angry 3.8%
Confused 5.3%
Surprised 2.9%
Sad 40.2%
Happy 38.3%

AWS Rekognition

Age 26-43
Gender Male, 98.5%
Disgusted 2.6%
Happy 2%
Sad 6.3%
Surprised 4.4%
Calm 70.9%
Confused 8.4%
Angry 5.3%

AWS Rekognition

Age 38-59
Gender Male, 95.7%
Angry 15.7%
Surprised 4%
Happy 2.2%
Calm 42.4%
Confused 6.9%
Sad 25.8%
Disgusted 3.1%

AWS Rekognition

Age 35-52
Gender Male, 65.4%
Calm 77.9%
Sad 7.4%
Surprised 1.7%
Disgusted 1.4%
Happy 0.9%
Confused 6.7%
Angry 4%

Microsoft Cognitive Services

Age 39
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Painting 96.8%

Categories

Imagga

paintings art 99.7%

Text analysis

Amazon

361
Coster
Ci pnualy

Google

e 367 Veterman A per Cester ps
367
A
Cester
e
Veterman
per
ps