Human Generated Data

Title

A Cardinal

Date

19th century

People

Artist: Johann Georg Schreiner, German, 1801 - 1859

Artist after: Raphael, Italian, 1483 - 1520

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G3631

Machine Generated Data

Tags

Amazon
created on 2019-11-06

Human 98.4
Person 98.4
Art 96.9
Drawing 89.3
Face 89.1
Text 85
Clothing 80.6
Apparel 80.6
Painting 79
Photo 67.8
Photography 67.8
Portrait 67.8
Advertisement 62.9
Sketch 62.2
Poster 61.4
Hat 60.6

Clarifai
created on 2019-11-06

portrait 99.6
people 99.4
one 98.5
adult 98.4
art 97.4
retro 94.9
wear 94
print 93.7
paper 92
man 91.3
bill 89.3
vintage 87.9
painting 86.1
book bindings 84.9
blank 84.7
illustration 83.8
lid 83.1
child 83.1
person 82.9
card 81.7

Imagga
created on 2019-11-06

mug shot 41.8
book jacket 38.1
jacket 32.6
photograph 32.5
creation 30.4
portrait 27.8
representation 26.7
adult 25.2
person 25
model 23.3
man 22.9
male 22.7
wrapping 22.6
face 21.3
book 20.1
covering 19.8
attractive 19.6
black 19.3
expression 18.8
product 17.9
people 17.3
money 16.2
close 15.4
fashion 15.1
business 14.6
sexy 13.7
serious 13.4
eyes 12.9
cash 12.8
guy 12.6
currency 12.6
bill 12.4
lady 12.2
smile 12.1
human 12
hair 11.9
dark 11.7
posing 11.6
bank 10.8
brunette 10.5
one 10.5
looking 10.4
women 10.3
youth 10.2
dollar 10.2
banking 10.1
head 10.1
cute 10
studio 9.9
handsome 9.8
pretty 9.8
sad 9.6
office 9.6
skin 9.6
casual 9.3
hand 9.1
sensual 9.1
financial 8.9
look 8.8
closeup 8.8
banknote 8.7
smiling 8.7
lifestyle 8.7
economy 8.4
vintage 8.3
friendly 8.2
happy 8.1
paper 8.1
dress 8.1
hat 8.1
wealth 8.1
body 8
boy 8
bad 7.8
space 7.8
blond 7.7
old 7.7
exchange 7.6
hairstyle 7.6
finance 7.6
laptop 7.6
lips 7.4
beard 7.2
eye 7.2
market 7.1

Google
created on 2019-11-06

Microsoft
created on 2019-11-06

wall 98.6
drawing 98.3
sketch 96.7
gallery 96.6
human face 94.3
scene 92.4
room 92.3
indoor 89.2
clothing 72.3
hat 70.7
person 65.4
text 63.2
smile 56
portrait 53.9
old 48.4
picture frame 27.7
painting 25.1

Face analysis

AWS Rekognition

Age 40-58
Gender Male, 93.7%
Calm 1.8%
Angry 44%
Disgusted 16.6%
Surprised 4.3%
Sad 1.5%
Happy 17.4%
Fear 4%
Confused 10.4%

Microsoft Cognitive Services

Age 70
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.4%

Captions

Microsoft

a painting of a man 82%
a painting of a man in a room 80.9%
a painting of a man in a white room 67.3%

Text analysis

Amazon

o.
70280 o.
70280
idlaleyind
itorulu
itorulu idlaleyind mill mhiny
mill
mhiny

Google

Ldioaleng
poamndiny
Ldioaleng he poamndiny
he