Human Generated Data

Title

Andrea di Odoni

Date

17th century

People

Artist: Cornelis Visscher, Dutch c. 1629 - 1658

Artist after: Lorenzo Lotto, Italian 1480 - 1556/1557

Classification

Prints

Machine Generated Data

Tags

Amazon

Painting 97.4
Art 97.4
Human 95.9
Person 95.9
Person 85.2
Person 82.4
Portrait 61.1
Photography 61.1
Photo 61.1
Face 61.1

Clarifai

people 100
art 99.7
adult 99.7
print 99.6
facial hair 99.4
group 99.2
portrait 99.2
painting 99.1
furniture 99
leader 98.3
engraving 98.2
one 98.1
man 98.1
illustration 98
canine 97.7
two 97.4
mammal 95.8
dog 95.7
Renaissance 93.5
lithograph 93.3

Imagga

currency 49.4
money 48.5
cash 43
dollar 40.9
finance 35.5
bank 34.1
bill 32.3
newspaper 31.7
wealth 31.4
dollars 30.9
banking 30.3
cadaver 29
hundred 28.1
business 28
paper 27.5
financial 25.8
product 25.2
close 24
savings 21.4
us 21.2
exchange 21
one 20.9
finances 20.2
creation 19.9
franklin 19.7
grandfather 19.2
bills 18.5
banknote 18.4
loan 18.2
investment 17.4
banknotes 16.6
pay 16.3
rich 14.9
economy 14.8
states 14.5
ancient 13.8
notes 13.4
sculpture 13.4
daily 13.2
note 12.9
culture 12.8
united 12.4
portrait 12.3
face 12.1
funds 11.8
statue 11.7
payment 11.5
profit 11.5
god 11.5
sign 11.3
success 11.3
old 11.2
wages 10.8
religion 10.8
concepts 10.7
market 10.7
capital 10.4
number 10.3
man 10.1
art 10
president 9.8
economic 9.7
price 9.6
closeup 9.4
commerce 9.3
ruler 9.2
vintage 9.1
grandma 9
history 8.9
rate 8.8
debt 8.7
stone 8.6
temple 8.5
male 8.5
people 8.4
person 8.3
kin 8.3
carving 8.2
greenback 7.9
twenty 7.9
legal 7.8
antique 7.8
paying 7.8
sales 7.7
coin 7.6
head 7.6
human 7.5
religious 7.5
stock 7.5
backgrounds 7.3
architecture 7

Microsoft

text 100
book 99.8
drawing 99.2
wall 99
sketch 98.3
indoor 97.3
person 91.3
human face 88.8
old 88.1
man 86.2
clothing 69.5
art 65.2
painting 16.2

Face analysis

AWS Rekognition

Age 32-48
Gender Male, 99.5%
Sad 1.2%
Disgusted 0%
Surprised 0.1%
Happy 0.1%
Angry 0.6%
Fear 0%
Confused 0.1%
Calm 97.9%

AWS Rekognition

Age 32-48
Gender Male, 98.5%
Fear 0.1%
Confused 0.5%
Calm 76%
Sad 21.8%
Disgusted 0.1%
Happy 0.5%
Surprised 0.3%
Angry 0.6%

Microsoft Cognitive Services

Age 48
Gender Male

Microsoft Cognitive Services

Age 48
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Painting 97.4%
Person 95.9%

Captions

Microsoft

a vintage photo of a man 90.1%
a vintage photo of a man holding a book 66.6%
an old photo of a man 66.5%

Text analysis

Google

eelu
The
tl
Cernelae eelu The Cunesy tl
Cernelae
Cunesy