Human Generated Data

Title

Great Crucifixion (center)

Date

1589

People

Artist: Agostino Carracci, Italian 1557 - 1602

Artist after: Jacopo Tintoretto, Italian 1519 - 1594

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of William Gray from the collection of Francis Calley Gray, G664.2

Machine Generated Data

Tags

Amazon
created on 2019-04-06

Human 98.1
Person 98.1
Person 97.7
Art 96.5
Painting 95.1
Person 95
Person 94.1
Person 89.1
Person 88.9
Person 80.6
Person 79.5
Horse 79.2
Mammal 79.2
Animal 79.2
Symbol 77.6
Cross 76.2
Person 68.9
Person 66.6
Crucifix 57.7
Person 53.9

Clarifai
created on 2018-04-19

people 100
many 99.8
adult 99.6
print 99.6
group 99.5
military 98.9
soldier 98.8
engraving 98.7
pain 97.4
art 97
man 96.4
war 96.4
weapon 95.3
illustration 95.1
skirmish 94.1
slavery 93.1
kneeling 91.7
battle wound 91.2
injury 90.7
disorder 90.6

Imagga
created on 2018-04-19

sketch 100
drawing 82
representation 60
art 32.4
comic book 30.2
vintage 21.5
old 19.5
history 18.8
retro 18
ancient 17.3
decoration 16.7
religion 16.1
antique 15.8
grunge 15.3
detail 15.3
design 14.3
sculpture 14
church 13.9
statue 13.8
temple 13.4
architecture 13.4
pattern 13
culture 12.8
stamp 12.6
paper 12.6
print media 12.1
god 11.5
letter 11
postmark 10.8
painting 10.8
black 10.8
postage 10.8
postal 10.8
structure 10.8
gold 10.7
mail 10.5
wall 10.5
texture 10.4
painted 9.5
ornament 9.5
golden 9.5
symbol 9.4
religious 9.4
artwork 9.2
book jacket 9.1
icon 8.7
artistic 8.7
holy 8.7
post 8.6
travel 8.5
fountain 8.4
city 8.3
stone 8.2
paint 8.2
currency 8.1
cool 8
building 7.9
spirituality 7.7
decorative 7.5
window 7.5
frame 7.5
glass 7.5
monument 7.5
style 7.4
man 7.4
shape 7.4
backgrounds 7.3
aged 7.2
color 7.2
landmark 7.2
carving 7.2
jacket 7.1
textured 7

Google
created on 2018-04-19

Microsoft
created on 2018-04-19

text 100
book 100
old 64
vintage 25.1

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Female, 50.3%
Happy 45%
Confused 45.1%
Calm 45%
Disgusted 54.6%
Angry 45.1%
Sad 45.1%
Surprised 45.1%

AWS Rekognition

Age 20-38
Gender Female, 53.2%
Confused 45.9%
Angry 45.9%
Calm 45.8%
Happy 45.4%
Surprised 45.8%
Disgusted 45.6%
Sad 50.7%

AWS Rekognition

Age 23-38
Gender Female, 53.2%
Angry 45.5%
Calm 47.6%
Happy 45.3%
Disgusted 45.9%
Surprised 45.4%
Sad 50.1%
Confused 45.3%

AWS Rekognition

Age 23-38
Gender Female, 53.5%
Confused 45.7%
Calm 46.2%
Happy 47.2%
Sad 46.5%
Disgusted 46.8%
Surprised 46.2%
Angry 46.4%

Feature analysis

Amazon

Person 98.1%
Painting 95.1%
Horse 79.2%

Captions

Microsoft

a vintage photo of a person 79.8%
a vintage photo of a book 57.8%
a vintage photo of a group of people looking at a book 40.9%

Text analysis

Amazon

PI.NRL
Larile
Omi nlo,
Omi nlo, Larile lachrirer
lachrirer
Gubns