Human Generated Data

Title

Ecce Homo

Date

19th century

People

Artist: William Henry Egleton, British, active circa 1833-1862

Artist after: Correggio (Antonio Allegri), Italian, c. 1489-1534

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, R14597

Machine Generated Data

Tags

Amazon
created on 2019-08-10

Person 98.5
Human 98.5
Art 96.9
Person 96.8
Person 91.5
Painting 90.4
Person 82.2
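
The label/confidence pairs above are the kind of output produced by Amazon Rekognition's DetectLabels operation. A minimal sketch of such a call with boto3 follows; the image file name and the MinConfidence threshold are illustrative assumptions, not values taken from this record.

```python
# Hypothetical sketch: generating image labels with Amazon Rekognition via boto3.
# The file name and MinConfidence value are assumptions for illustration only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("ecce_homo_print.jpg", "rb") as f:   # assumed local image of the print
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

# Each label carries a name and a confidence score, as in the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```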

Clarifai
created on 2019-08-10

people 100
art 99.7
adult 99.7
portrait 99.5
two 99.2
man 98.9
facial hair 98.9
one 98.6
sit 98
furniture 97.7
engraving 97.1
print 95.9
gown (clothing) 95.5
leader 95.4
illustration 94.7
group 94.5
veil 93
Renaissance 91.8
kneeling 91.8
seat 91.2

Imagga
created on 2019-08-10

sculpture 59.5
statue 48.6
column 44.8
carving 39.1
art 34.6
religion 32.3
ancient 27.7
architecture 26.8
culture 24.8
sketch 24.1
stone 22
old 21.6
history 21.5
god 20.1
temple 20
figure 20
religious 18.8
monument 18.7
obelisk 17.4
drawing 17.4
historical 16
church 15.7
representation 15
plastic art 15
travel 14.8
historic 14.7
structure 14.6
spiritual 14.4
book jacket 14.3
famous 14
holy 13.5
detail 12.9
building 12.7
marble 12.6
city 12.5
vintage 12.4
antique 12.3
decoration 12.1
face 12.1
carved 11.7
tourism 11.6
spirituality 11.5
jacket 11.1
catholic 10.7
pray 10.7
head 10.1
roman 10
landmark 9.9
design 9.7
meditation 9.6
golden 9.5
money 9.4
close 9.1
museum 8.7
prayer 8.7
heritage 8.7
saint 8.7
wrapping 8.5
east 8.4
decorative 8.4
traditional 8.3
exterior 8.3
symbol 8.1
man 8.1
portrait 7.8
worship 7.7
faith 7.7
capital 7.6
one 7.5
cash 7.3
tourist 7.3
covering 7.1

Google
created on 2019-08-10

Microsoft
created on 2019-08-10

text 99.9
painting 97.4
drawing 97.4
book 96.6
human face 96.5
sketch 94.6
art 90.5
person 86.8
posing 79.3
man 56
cartoon 54.2
clothing 53.5
old 49

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 80.9%
Angry 0.7%
Disgusted 0.5%
Sad 30%
Surprised 1%
Fear 2%
Calm 3.3%
Happy 0.1%
Confused 62.5%

AWS Rekognition

Age 23-35
Gender Male, 99.2%
Calm 98.5%
Happy 0.2%
Sad 0.7%
Disgusted 0.1%
Fear 0.1%
Angry 0.2%
Confused 0.1%
Surprised 0.3%

AWS Rekognition

Age 18-30
Gender Female, 92.9%
Calm 35.9%
Angry 0.1%
Surprised 0%
Confused 0%
Happy 0.2%
Sad 63.7%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 34-50
Gender Male, 97.1%
Surprised 6.3%
Angry 1.3%
Happy 0.5%
Confused 2.1%
Fear 5.1%
Calm 78.8%
Disgusted 0.5%
Sad 5.4%

AWS Rekognition

Age 13-25
Gender Female, 88.4%
Fear 0.1%
Calm 96.9%
Confused 0.1%
Happy 0.3%
Sad 0.7%
Surprised 1.7%
Angry 0.1%
Disgusted 0.1%
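
The age ranges, gender estimates, and emotion percentages above correspond to the FaceDetails structure that Amazon Rekognition's DetectFaces operation returns when all attributes are requested. A minimal sketch with boto3, assuming a local image file:

```python
# Hypothetical sketch: face attributes with Amazon Rekognition's DetectFaces.
# The image file name is an assumption for illustration only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("ecce_homo_print.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],          # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]                      # {"Low": ..., "High": ...}
    gender = face["Gender"]                     # {"Value": ..., "Confidence": ...}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:            # e.g. CALM, SAD, CONFUSED ...
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```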

Microsoft Cognitive Services

Age 38
Gender Female

Microsoft Cognitive Services

Age 42
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Likely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Very unlikely
Headwear Likely
Blurred Very unlikely
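
The "Very unlikely" / "Possible" / "Likely" values above follow the likelihood scale used by Google Cloud Vision face detection (joy, sorrow, anger, surprise, headwear, blurred). A minimal sketch with the google-cloud-vision Python client, assuming a local image file and default credentials:

```python
# Hypothetical sketch: face likelihoods with the Google Cloud Vision API.
# The image path is an assumption; credentials are taken from the environment.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("ecce_homo_print.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood values: UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE,
# LIKELY, VERY_LIKELY -- the same scale shown above.
likelihood = vision.Likelihood
for face in response.face_annotations:
    print("Surprise", likelihood(face.surprise_likelihood).name)
    print("Anger", likelihood(face.anger_likelihood).name)
    print("Sorrow", likelihood(face.sorrow_likelihood).name)
    print("Joy", likelihood(face.joy_likelihood).name)
    print("Headwear", likelihood(face.headwear_likelihood).name)
    print("Blurred", likelihood(face.blurred_likelihood).name)
```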

Feature analysis

Amazon

Person 98.5%
Painting 90.4%

Text analysis

Amazon

Teaia

Google

Engraved
by
.
E
deton
5
LI
Punted
Engraved by W . E deton Trafe 1.I 5 Teaia LI 6 Punted by Coneggio
W
Trafe
1.I
Teaia
6
Coneggio
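
The entries above are raw OCR output. The Google list pairs one full detected line with its individual word tokens, which is why a whole inscription (likely "Engraved by W. H. Egleton ... Painted by Correggio", mangled by the OCR) appears alongside single fragments. A minimal sketch of a comparable text-detection call with the google-cloud-vision client, assuming a local image file:

```python
# Hypothetical sketch: OCR on the print's inscription with Google Cloud Vision.
# The image path is an assumption; credentials come from the environment.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("ecce_homo_print.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are individual
# words, which is why the list above mixes a whole line with single tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```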