Human Generated Data

Title

Entombment

Date

19th century

People

Artist: Samuel Amsler, German 1791 - 1849

Artist after: Raphael, Italian 1483 - 1520

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G40

Machine Generated Data

Tags

Amazon
created on 2019-11-06

Human 96.7
Person 96.7
Art 95
Person 94.6
Person 93.4
Person 91
Person 86.4
Person 86.3
Painting 71.8
Person 67.6
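
The Amazon labels above (a name plus a 0-100 confidence score) have the shape of AWS Rekognition's label-detection output. As a rough sketch only, the filename, label limit, and confidence threshold below are assumptions, not details recorded on this page:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("entombment.jpg", "rb") as f:   # assumed local filename
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,          # assumed limit
        MinConfidence=50.0,    # assumed threshold
    )

    # Each label carries a name and a 0-100 confidence, matching the
    # "Person 96.7 / Art 95 / ..." style of the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')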

Clarifai
created on 2019-11-06

art 99.4
people 99.2
illustration 97.7
man 97.5
adult 96.6
print 95.6
group 93.3
Renaissance 93.1
painting 92.7
woman 92.4
baby 89.1
two 87.9
baroque 87.7
portrait 86
picture frame 84.2
one 83
old 82.7
nude 80.5
child 79.5
leader 76.9
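
The Clarifai concepts above are the kind of result returned by a Clarifai "predict" call. A minimal sketch, assuming the v2 REST endpoint and a general-purpose concept model; the API key, model id, and image URL are placeholders, and the exact model behind this 2019 record is not stated:

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"            # placeholder
    MODEL_ID = "general-image-recognition"       # assumed general concept model

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/entombment.jpg"}}}]},
    )

    # Concepts come back with a 0-1 "value"; scaling by 100 gives scores in the
    # same form as the list above (art 99.4, people 99.2, ...).
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')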

Imagga
created on 2019-11-06

book jacket 65.5
jacket 52
wrapping 38.7
old 32.7
vintage 28.1
covering 26.5
grunge 23.8
ancient 23.3
art 23.1
antique 21.7
aged 20.8
retro 19.7
sculpture 18.4
texture 17.4
wall 16.3
paper 15.7
product 15.6
dirty 15.4
newspaper 15.3
design 14.8
letter 14.7
structure 13.5
rusty 13.3
decoration 13
detail 12.9
creation 12.6
statue 12
frame 11.8
postmark 11.8
postage 11.8
postal 11.8
architecture 11.7
stamp 11.6
carving 11.6
mail 11.5
historic 11
envelope 10.5
damaged 10.5
comic book 10.4
stone 10.3
black 10.2
circa 9.9
material 9.8
textured 9.6
grungy 9.5
blank 9.4
monument 9.3
close 9.1
border 9
renaissance 8.8
memorial 8.7
book 8.7
card 8.6
empty 8.6
post 8.6
culture 8.5
pattern 8.2
global 8.2
building 8.1
brown 8.1
closeup 8.1
currency 8.1
shows 7.9
printed 7.9
paintings 7.8
parchment 7.7
wallpaper 7.7
money 7.7
old fashioned 7.6
weathered 7.6
unique 7.6
page 7.4
grain 7.4
message 7.3
rough 7.3
office 7.2
surface 7.1
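
The Imagga tags above can be reproduced with Imagga's tagging endpoint, which returns a confidence score and an English label per tag. A sketch, assuming the v2 REST API; the credentials and image URL are placeholders:

    import requests

    AUTH = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")   # placeholder credentials

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/entombment.jpg"},
        auth=AUTH,
    )

    # Each tag has a confidence and an English label, which is the shape of the
    # "book jacket 65.5 / jacket 52 / ..." list above.
    for tag in response.json()["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')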

Google
created on 2019-11-06

Microsoft
created on 2019-11-06

gallery 99
text 97.5
room 95.8
scene 95.5
person 95.2
clothing 92.1
box 75.8
old 67.8
man 65.6
posing 64
woman 59.6
painting 58.3
different 55
vintage 40.7
picture frame 10.9
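
The Microsoft tags above resemble the output of Azure Computer Vision's tag operation. A sketch only; the endpoint region, API version, and key below are assumptions, since the service version behind this 2019 record is not recorded:

    import requests

    ENDPOINT = "https://westus.api.cognitive.microsoft.com/vision/v2.0/tag"  # assumed
    KEY = "YOUR_AZURE_CV_KEY"                                                # placeholder

    response = requests.post(
        ENDPOINT,
        headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
        json={"url": "https://example.org/entombment.jpg"},
    )

    # Confidence is returned on a 0-1 scale; scaling by 100 matches the
    # "gallery 99 / text 97.5 / ..." presentation above.
    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')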

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 2-8
Gender Female, 53.4%
Disgusted 45%
Confused 45%
Angry 45%
Calm 45%
Sad 54.9%
Fear 45%
Happy 45%
Surprised 45%

AWS Rekognition

Age 23-37
Gender Male, 53.8%
Confused 45.1%
Angry 52.9%
Happy 45.1%
Sad 45%
Fear 45.4%
Disgusted 45.1%
Calm 46.1%
Surprised 45.2%

AWS Rekognition

Age 26-42
Gender Male, 53.7%
Happy 45%
Disgusted 45%
Fear 45.1%
Calm 45%
Angry 54.8%
Sad 45%
Confused 45.1%
Surprised 45%

AWS Rekognition

Age 24-38
Gender Male, 53.3%
Confused 45.1%
Fear 45%
Angry 49.6%
Calm 49.8%
Surprised 45.1%
Sad 45.2%
Happy 45%
Disgusted 45.2%

AWS Rekognition

Age 21-33
Gender Male, 53.8%
Angry 54.2%
Surprised 45%
Fear 45.1%
Disgusted 45.1%
Confused 45%
Sad 45%
Happy 45%
Calm 45.5%

AWS Rekognition

Age 20-32
Gender Female, 53.1%
Angry 45.5%
Disgusted 45.1%
Surprised 45%
Fear 45.3%
Calm 53.5%
Confused 45%
Sad 45.5%
Happy 45.1%

AWS Rekognition

Age 22-34
Gender Male, 52.3%
Angry 46.9%
Fear 45.1%
Sad 45.4%
Surprised 45%
Calm 52.6%
Disgusted 45%
Confused 45%
Happy 45%

AWS Rekognition

Age 19-31
Gender Male, 51.9%
Sad 45.6%
Confused 45.1%
Fear 45%
Happy 45.1%
Angry 45.5%
Disgusted 45%
Calm 53.6%
Surprised 45%
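
Each of the eight face records above (an age range, a gender with confidence, and eight emotion scores) matches the structure of AWS Rekognition's DetectFaces response. A sketch, with the image source assumed:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("entombment.jpg", "rb") as f:   # assumed local filename
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and per-emotion confidences,
    # which is what each "AWS Rekognition" block above reports.
    response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')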

Feature analysis

Amazon

Person 96.7%
Painting 71.8%

Categories

Imagga

pets animals 52.8%
paintings art 46.9%
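
The two Imagga categories above ("pets animals", "paintings art") look like output from an Imagga categorizer. A sketch, assuming the v2 categories endpoint and the "personal_photos" categorizer; both the categorizer id and the image URL are assumptions:

    import requests

    AUTH = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")   # placeholder credentials

    response = requests.get(
        "https://api.imagga.com/v2/categories/personal_photos",   # assumed categorizer
        params={"image_url": "https://example.org/entombment.jpg"},
        auth=AUTH,
    )

    # Categories carry a confidence and an English name, matching the
    # "pets animals 52.8% / paintings art 46.9%" pair above.
    for category in response.json()["result"]["categories"]:
        print(f'{category["name"]["en"]} {category["confidence"]:.1f}%')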

Text analysis

Amazon

piyxir
NPHAR
VBBYAS
U.
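
The fragments above ("piyxir", "NPHAR", "VBBYAS", "U.") read like OCR attempts on the engraved lettering, in the form returned by AWS Rekognition's text detection. A sketch, with the image source assumed:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("entombment.jpg", "rb") as f:   # assumed local filename
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Detections are typed LINE or WORD; the fragments listed above are the detected
    # strings themselves (engraved lettering is difficult for OCR, hence the noise).
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"])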

Google

APHAR VRBNAS
APHAR
VRBNAS
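
The Google entries ("APHAR VRBNAS", then the same words split) match Google Cloud Vision's text detection, where the first annotation is the full detected string and the rest are individual words. A sketch, assuming the Python client library is installed and credentials are configured, with the image file a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()    # assumes credentials are configured

    with open("entombment.jpg", "rb") as f:   # assumed local filename
        content = f.read()

    response = client.text_detection(image=vision.Image(content=content))

    # The first annotation is the full text block; subsequent annotations are single
    # words, which is why "APHAR VRBNAS" appears once joined and then split above.
    for annotation in response.text_annotations:
        print(annotation.description)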