Human Generated Data

Title

The Seated Virgin

Date

18th century

People

Artist: Johann Gotthard von Müller, German, 1747–1830

Artist after: Raphael, Italian, 1483–1520

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R3375

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 98%
Human 98%
Painting 95.6%
Art 95.6%
Person 93.4%
Person 78.2%

Clarifai
created on 2019-11-16

people 99.8%
portrait 98.7%
adult 97.9%
two 97.7%
art 97.1%
baby 95.9%
son 95.7%
man 95.5%
woman 95.2%
family 94.5%
child 94.5%
love 93.9%
affection 93.4%
girl 89.3%
group 88.8%
ma 87.5%
one 85.3%
beautiful 80.9%
three 80.1%
retro 78.9%
retro 78.9

Imagga
created on 2019-11-16

money 40.8%
baby 39.9%
currency 38.6%
cash 38.4%
newspaper 36.7%
dollar 35.3%
wealth 30.5%
dollars 29.9%
product 28.8%
paper 28.2%
finance 27.9%
bank 26.2%
creation 26%
banking 25.7%
business 25.5%
financial 24.9%
bill 24.7%
child 24%
hundred 23.2%
close 22.8%
savings 22.4%
parent 21.1%
bills 20.4%
us 20.2%
franklin 18.7%
one 18.7%
rich 18.6%
loan 18.2%
portrait 16.8%
mother 16.8%
banknotes 16.6%
finances 16.4%
father 16.1%
dad 16.1%
love 15%
fetus 14.9%
pay 14.4%
exchange 14.3%
cute 13.6%
closeup 13.5%
profit 13.4%
family 13.3%
economy 13%
neonate 12.9%
face 12.8%
banknote 12.6%
people 12.3%
man 12.1%
investment 11.9%
vertebrate 11.1%
funds 10.8%
states 10.6%
kid 10.6%
payment 10.6%
united 10.5%
old 10.4%
capital 10.4%
male 10.3%
adult 10.3%
note 10.1%
happy 10%
representation 9.8%
wages 9.8%
human 9.7%
home 9.6%
sign 9%
life 9%
concepts 8.9%
daily 8.7%
debt 8.7%
notes 8.6%
person 8.6%
head 8.4%
number 8.4%
china 8.3%
market 8%
couple 7.8%
happiness 7.8%
value 7.8%
infant 7.7%
sketch 7.7%
save 7.6%
joy 7.5%
senior 7.5%
commerce 7.5%
vintage 7.4%
chordate 7.4%
mature 7.4%
adorable 7.4%
success 7.2%
smiling 7.2%
looking 7.2%
smile 7.1%
little 7.1%
together 7%

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

baby 99.3%
text 99.2%
toddler 97.1%
human face 96.6%
person 93.5%
indoor 89.7%
clothing 87.4%
child 78.7%
posing 37.6%

Face analysis

AWS Rekognition

Age 12-22
Gender Female, 99.4%
Calm 99.2%
Fear 0%
Angry 0.1%
Disgusted 0%
Sad 0.1%
Happy 0.5%
Surprised 0%
Confused 0%

AWS Rekognition

Age 0-3
Gender Female, 85.7%
Surprised 9.6%
Disgusted 1.1%
Sad 4.5%
Angry 0.4%
Calm 79.6%
Happy 2.1%
Fear 0.2%
Confused 2.6%

AWS Rekognition

Age 2-8
Gender Female, 55.5%
Angry 0.2%
Calm 92.2%
Sad 0.4%
Happy 0.1%
Fear 0.1%
Confused 1.4%
Surprised 5.3%
Disgusted 0.4%

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 5
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98%
Painting 95.6%

Captions

Microsoft

a vintage photo of a baby 42.6%

Text analysis

Amazon

CHAISE
LA
LA VTERGE A LA CHAISE
A
VTERGE
then
Moien?
jener flale then Moien?
flale
jener
Grranen
nwtar
Wm nwtar Grranen
Wm
adel
2niny.
2niny. Tanali
Tanali
Gisem

Google

Afenen
falate
LA
lhanMaryhen
VIERGE
A
Afenen falate lhanMaryhen LA VIERGE A LA CHAISE
CHAISE