Human Generated Data

Title

The Entombment of Christ

Date

17th century

People

Artist: Michelangelo Merisi da Caravaggio, Italian, 1571–1610

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Edward W. Forbes and the Friends of the Fogg Museum of Art Fund, 1929.335

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Person 91.5
Human 91.5
Art 87.7
Painting 83.4
Person 63.8
Apparel 62.7
Clothing 62.7
People 61.6
Female 60.2
Person 59.8
Face 59.3
Photo 56.7
Portrait 56.7
Photography 56.7

Clarifai
created on 2020-04-24

people 100
adult 99
art 98.6
group 98.2
man 96.5
two 96.4
woman 96.3
illustration 95.6
print 95.2
wear 92.2
interaction 91.9
administration 90.9
painting 90.6
leader 89.7
war 89.4
child 88
portrait 87.3
furniture 86.6
three 86.6
baby 86.3

Imagga
created on 2020-04-24

sculpture 51.4
statue 48.1
religion 34.1
art 32.1
temple 30.9
ancient 27.7
stone 27.3
god 24.9
culture 24.8
old 24.4
architecture 24.2
monument 22.4
history 22.4
carving 21.1
religious 19.7
case 19
travel 19
figure 18.3
famous 15.8
holy 14.4
spirituality 14.4
tourism 14
church 13.9
antique 13.8
pray 13.6
spiritual 13.4
fountain 13
historic 12.8
worship 12.6
face 12.1
historical 11.3
device 11.2
shop 11
marble 10.8
building 10.7
meditation 10.5
faith 10.5
detail 10.5
city 10
landmark 9.9
carved 9.8
prayer 9.7
structure 9.6
decoration 9.5
barbershop 9.4
east 9.3
potter's wheel 9.3
gold 9
bible 8.8
belief 8.8
mercantile establishment 8.7
facade 8.7
oriental 8.5
machine 8.4
head 8.4
tourist 8.3
traditional 8.3
peace 8.2
man 8.1
wheel 8
window 7.9
mythology 7.9
artistic 7.8
knocker 7.8
wall 7.7
cathedral 7.7
book jacket 7.7
comic book 7.6
vintage 7.4
water 7.3
bronze 7

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

text 99.6
person 96.6
painting 96.5
old 95.5
drawing 94.3
book 90.6
sketch 87.5
human face 86.5
window 84.6
cartoon 75.9
posing 63.4
poster 55.8
vintage 48.3
family 22.5
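The tag lists above come from four services that all report confidence on a 0–100 scale, but with overlapping and differently cased labels ("Painting", "painting"). A minimal sketch of how tags like these might be merged and thresholded, using a few values copied from the record (the function names and threshold are illustrative, not from any vendor SDK):

```python
# Merge per-service tag lists, keeping the highest confidence seen
# for each (case-insensitive) label, then filter by a threshold.
def merge_tags(services, threshold=80.0):
    merged = {}
    for tags in services.values():
        for label, conf in tags:
            key = label.lower()
            merged[key] = max(merged.get(key, 0.0), conf)
    return {k: v for k, v in merged.items() if v >= threshold}

# A few values copied from the record above.
services = {
    "Amazon": [("Person", 91.5), ("Art", 87.7), ("Painting", 83.4)],
    "Clarifai": [("people", 100.0), ("art", 98.6), ("painting", 90.6)],
    "Microsoft": [("person", 96.6), ("painting", 96.5), ("old", 95.5)],
}

print(merge_tags(services))
print(merge_tags(services, threshold=97.0))
```

Raising the threshold quickly narrows the merged set to the labels all services agree on most strongly, such as "art" and "people".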

Color Analysis

Face analysis

AWS Rekognition

Age 21-33
Gender Female, 96.4%
Sad 92.6%
Calm 4.1%
Fear 1.2%
Surprised 0.8%
Confused 0.8%
Angry 0.4%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 34-50
Gender Male, 92.4%
Sad 75.5%
Calm 13.8%
Confused 3.3%
Angry 2.1%
Happy 2.0%
Disgusted 1.7%
Fear 0.8%
Surprised 0.8%

AWS Rekognition

Age 20-32
Gender Female, 81.7%
Sad 45.6%
Calm 24.5%
Angry 18.9%
Happy 4.4%
Confused 3.6%
Surprised 1.5%
Fear 0.8%
Disgusted 0.6%

AWS Rekognition

Age 22-34
Gender Male, 89.9%
Calm 91.6%
Happy 4.9%
Sad 1.8%
Confused 0.5%
Surprised 0.5%
Angry 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 23-35
Gender Female, 92%
Sad 56.7%
Calm 30.1%
Angry 5.0%
Disgusted 2.1%
Happy 1.6%
Confused 1.5%
Surprised 1.5%
Fear 1.4%
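Each Rekognition face block above reports a full emotion distribution; the dominant emotion is simply the highest-scoring entry. A small sketch (the helper name is illustrative) using the values from the first face in the record:

```python
# Pick the dominant emotion from a Rekognition-style score table.
def dominant_emotion(scores):
    return max(scores.items(), key=lambda kv: kv[1])

# Values copied from the first AWS Rekognition face above.
face_1 = {
    "Surprised": 0.8, "Sad": 92.6, "Happy": 0.1, "Disgusted": 0.1,
    "Angry": 0.4, "Calm": 4.1, "Fear": 1.2, "Confused": 0.8,
}

print(dominant_emotion(face_1))  # → ('Sad', 92.6)
```

Four of the five detected faces score "Sad" highest, which is consistent with the mourning figures in an Entombment scene.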

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 91.5%
Painting 83.4%

Categories