Human Generated Data

Title

Raising of Lazarus

Date

1675-1725

People

Artist: Unidentified Artist

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Edward W. Forbes, 1953.204

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Human 95.7
Person 95.7
Person 95
Art 94.4
Painting 93
Person 92.4
Person 85.6
Drawing 77.5
Home Decor 59.3
Person 58.3
Archaeology 56.7
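
These label/confidence pairs match the shape of output returned by the Amazon Rekognition DetectLabels API. A minimal sketch of how such tags could be produced, assuming the object image is available as a local JPEG; the file name and region are placeholders, not taken from this record:

```python
import boto3

# Rekognition client; the region is an assumption for illustration only.
client = boto3.client("rekognition", region_name="us-east-1")

# "raising_of_lazarus.jpg" is a placeholder for the museum's image file.
with open("raising_of_lazarus.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

# Each label carries a name and a confidence score, as listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```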

Clarifai
created on 2020-04-24

people 100
print 99.8
adult 99.4
art 99.4
group 99.3
illustration 99.2
engraving 98.4
painting 97.5
one 97.5
two 96.3
man 96
leader 95.4
veil 94.1
weapon 94
wear 93.9
woman 91.7
woodcut 91.4
administration 90.8
several 90.6
gown (clothing) 89.4

Imagga
created on 2020-04-24

old 34.8
binding 30.2
vintage 29.8
ancient 28.5
grunge 26.4
graffito 25.2
antique 25.1
stone 24.3
decoration 23.6
fire screen 23.5
texture 22.2
aged 21.7
retro 21.3
art 21
protective covering 20.3
screen 20.3
memorial 19.8
covering 19.1
history 17
gravestone 16.2
stamp 15.9
paper 15.8
wall 14.7
structure 14.3
temple 14.2
mail 13.4
close 13.1
pattern 13
letter 12.8
postage 12.8
sculpture 12.7
religion 12.5
book jacket 12.4
grungy 12.3
design 12
postmark 11.8
postal 11.8
currency 11.7
carving 11.1
money 11.1
rough 10.9
architecture 10.9
dirty 10.8
frame 10.8
black 10.8
cemetery 10.7
jacket 10.6
travel 10.6
post 10.5
detail 10.5
culture 10.3
envelope 9.9
cash 9.2
device 9
financial 8.9
empty 8.6
blank 8.6
statue 8.6
book 8.4
banking 8.3
note 8.3
historic 8.2
artwork 8.2
global 8.2
material 8
textured 7.9
carved 7.8
museum 7.8
used 7.7
worn 7.6
painted 7.6
finance 7.6
wrapping 7.3
brass 7.3
border 7.2

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

drawing 99.8
sketch 99.1
painting 98.6
cartoon 95.2
text 95.1
child art 91
illustration 66.3
art 64.7
person 62.6
old 53.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 39-57
Gender Female, 52.1%
Happy 45.1%
Angry 47%
Disgusted 45.1%
Confused 45.1%
Calm 46.2%
Surprised 45.1%
Sad 50.8%
Fear 45.6%

AWS Rekognition

Age 49-67
Gender Female, 51.2%
Calm 45.2%
Fear 45.1%
Angry 45.8%
Sad 53.8%
Surprised 45%
Disgusted 45%
Happy 45%
Confused 45.1%

AWS Rekognition

Age 32-48
Gender Male, 54.2%
Surprised 45.3%
Confused 45.1%
Happy 45.1%
Fear 46.3%
Sad 47.7%
Calm 49.4%
Angry 46%
Disgusted 45.1%

AWS Rekognition

Age 40-58
Gender Male, 51.9%
Disgusted 45%
Happy 45.1%
Confused 45%
Surprised 45%
Calm 51.2%
Fear 45%
Angry 45.1%
Sad 48.5%

AWS Rekognition

Age 29-45
Gender Female, 50.5%
Surprised 45.2%
Happy 45.1%
Fear 45.2%
Confused 45.1%
Disgusted 45%
Sad 45.4%
Calm 53.8%
Angry 45.2%

AWS Rekognition

Age 26-40
Gender Male, 53.1%
Angry 45.1%
Calm 45.8%
Surprised 45%
Happy 45%
Fear 45.1%
Disgusted 45%
Sad 53.9%
Confused 45.1%
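
The per-face age, gender, and emotion estimates above follow the structure of the Amazon Rekognition DetectFaces response. A minimal sketch, again assuming a local placeholder image file and region:

```python
import boto3

# Region is an assumption; "raising_of_lazarus.jpg" is a placeholder file name.
client = boto3.client("rekognition", region_name="us-east-1")

with open("raising_of_lazarus.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed to get age range, gender, and emotions
    )

# One FaceDetails entry per detected face, mirroring the blocks listed above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotion types come back uppercase (e.g. "SAD"), each with a confidence.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```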

Feature analysis

Amazon

Person 95.7%
Painting 93%

Categories

Captions