Human Generated Data

Title

The Seated Virgin

Date

18th–19th century

People

Artist: Johann Gotthard von Müller, German, 1747–1830

Artist after: Raphael, Italian, 1483–1520

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G2836

Machine Generated Data

Tags

Amazon
created on 2019-08-10

Human 97.8
Person 97.8
Art 96.1
Person 95.5
Painting 89.4
Person 87.9
Archangel 60.2
Angel 60.2
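Tag lists like the Amazon block above are the kind of output produced by AWS Rekognition's DetectLabels API. A minimal sketch of flattening such a response into the "name score" lines shown here — the `sample` dict mimics Rekognition's response shape and is illustrative, not this object's actual data:

```python
# Sketch: turn an AWS Rekognition DetectLabels response into
# "name score" lines like the tag list above.
# The `sample` dict is a hand-written stand-in, not real museum data.

def format_labels(response, min_confidence=50.0):
    """Return (name, rounded confidence) pairs, highest score first."""
    pairs = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response.get("Labels", [])
        if label["Confidence"] >= min_confidence
    ]
    return sorted(pairs, key=lambda p: -p[1])

sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 97.84},
        {"Name": "Art", "Confidence": 96.12},
        {"Name": "Angel", "Confidence": 60.21},
    ]
}

for name, score in format_labels(sample):
    print(name, score)
```

A live call would be `boto3.client("rekognition").detect_labels(Image={"Bytes": image_bytes})`, which returns a dict of this shape; credentials and the image bytes are needed, so the sketch works from a sample response instead.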

Clarifai
created on 2019-08-10

people 99.9
portrait 99.6
art 99.5
baby 99.4
two 99
adult 98.8
woman 95.7
one 95.6
affection 95.2
print 95
child 94.9
love 94.7
man 94.1
family 93.3
son 93.3
illustration 91.9
group 89.6
religion 89.4
girl 88.6
Renaissance 88.5

Imagga
created on 2019-08-10

child 31.1
baby 30.1
sketch 29.1
parent 22.6
currency 22.4
money 22.1
cash 21
drawing 21
mother 20
representation 19.3
love 18.1
portrait 18.1
family 16.9
finance 16.9
dollar 16.7
paper 16.5
business 16.4
people 16.2
man 16.1
one 15.7
banking 15.6
face 15.6
father 15.5
close 14.8
dollars 14.5
financial 14.2
male 14.1
savings 14
cute 13.6
dad 13.5
wealth 13.5
boy 13
person 12.9
bank 12.5
adult 12.3
home 12
happy 11.9
banknote 11.6
childhood 11.6
hand 11.5
bill 11.4
life 11.2
head 10.9
newborn 10.7
hundred 10.6
kid 10.6
little 10.6
old 10.4
economy 10.2
closeup 10.1
holding 9.9
banknotes 9.8
affectionate 9.7
us 9.6
finances 9.6
pay 9.6
smiling 9.4
happiness 9.4
daughter 9.2
investment 9.2
black 9.1
human 9
concepts 8.9
looking 8.8
bills 8.7
infant 8.7
lifestyle 8.7
neonate 8.6
loan 8.6
bed 8.5
china 8.5
rich 8.4
adorable 8.3
body 8
hands 7.8
attractive 7.7
expression 7.7
cat 7.6
rest 7.4
children 7.3
smile 7.1
porcelain 7.1
son 7.1

Google
created on 2019-08-10

Microsoft
created on 2019-08-10

baby 99.3
text 96.9
toddler 96.4
human face 95.4
person 91.3
clothing 82.6
child 70.1

Face analysis

AWS Rekognition

Age 11-21
Gender Female, 99.1%
Sad 0.1%
Disgusted 0%
Angry 0%
Surprised 0%
Calm 99.6%
Fear 0%
Confused 0%
Happy 0.2%

AWS Rekognition

Age 0-4
Gender Female, 75%
Angry 0.9%
Disgusted 2.2%
Sad 3.3%
Surprised 2.4%
Fear 0.3%
Calm 89.5%
Happy 0.4%
Confused 1%

AWS Rekognition

Age 4-14
Gender Female, 83.9%
Confused 58.1%
Angry 4.5%
Surprised 19.7%
Fear 1.5%
Sad 1.2%
Calm 12.3%
Disgusted 2.7%
Happy 0.2%
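The AWS Rekognition face blocks above correspond to `FaceDetails` entries from the DetectFaces API: each detected face carries an age range, a gender estimate with confidence, and a list of scored emotions. A hedged sketch of condensing one such entry into the "Age / Gender / dominant emotion" summary used here — `face` is a hand-written stand-in for a real response item:

```python
# Sketch: condense one AWS Rekognition FaceDetail entry into the
# "Age / Gender / dominant emotion" summary shown above.
# `face` is illustrative, not the actual API response for this print.

def summarize_face(face):
    age = face["AgeRange"]
    gender = face["Gender"]
    # Rekognition returns all emotions with scores; pick the highest.
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return {
        "age": f"{age['Low']}-{age['High']}",
        "gender": f"{gender['Value']}, {gender['Confidence']:.1f}%",
        "emotion": f"{top['Type'].capitalize()} {top['Confidence']:.1f}%",
    }

face = {
    "AgeRange": {"Low": 11, "High": 21},
    "Gender": {"Value": "Female", "Confidence": 99.1},
    "Emotions": [
        {"Type": "CALM", "Confidence": 99.6},
        {"Type": "HAPPY", "Confidence": 0.2},
        {"Type": "SAD", "Confidence": 0.1},
    ],
}

print(summarize_face(face))
```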

Microsoft Cognitive Services

Age 5
Gender Female

Microsoft Cognitive Services

Age 3
Gender Female

Microsoft Cognitive Services

Age 24
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.8%
Painting 89.4%

Categories

Imagga

paintings art 100%

Text analysis

Google

ergeo
TO
C
La ergeo TO C
La
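The Google block above is raw OCR output. Google Cloud Vision's `text_annotations` list typically begins with one entry holding the full detected string, followed by each word as its own entry, which is why fragments like "ergeo" and "La" appear both alone and combined. A sketch of separating the two, with an illustrative response shape (a live call would use the `google-cloud-vision` client):

```python
# Sketch: Google Vision's text_annotations list starts with the whole
# detected string, then repeats each word as a separate entry.
# `annotations` below is illustrative, not this page's actual response.

def split_full_text(annotations):
    """Return (full detected text, individual word entries)."""
    if not annotations:
        return "", []
    full = annotations[0]["description"]
    words = [a["description"] for a in annotations[1:]]
    return full, words

annotations = [
    {"description": "La ergeo TO C"},  # entry 0: the combined string
    {"description": "La"},
    {"description": "ergeo"},
    {"description": "TO"},
    {"description": "C"},
]

full, words = split_full_text(annotations)
print(full)
print(words)
```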