Human Generated Data

Title

Peasants Dancing Among Ruins

Date

17th century

People

Artist: Gerrit de Heer, Dutch 1606 - 1652

Classification

Drawings

Credit Line

The Maida and George Abrams Collection, Fogg Art Museum, Harvard University, Cambridge, Massachusetts, Promised Gift, 7.2019.13

Machine Generated Data

Tags

Amazon
created on 2019-09-25

Art 99.8
Painting 99.8
Human 99.7
Person 99.7
Person 99.2
Person 98.8
Person 98.5
Person 92.6
Person 81.7
Person 68.5
Person 63.4

Clarifai
created on 2019-09-25

people 99.9
adult 99.5
group 99.1
painting 98.8
man 97.2
woman 96.5
art 96.3
calamity 95.1
home 94
child 93.6
religion 92.4
one 92.1
battle 91.8
weapon 91.1
war 90.7
two 90.5
mammal 89.8
town 89.5
position 89.2
many 88.6

Imagga
created on 2019-09-25

brick 83.8
building material 61
ancient 41.6
wall 40
architecture 39.3
travel 35.3
stone 33.1
old 31.4
temple 27.6
building 27.4
tourism 27.3
history 26
landmark 22.6
monument 21.5
famous 20.5
ruins 20.5
historic 20.2
sky 19.2
culture 18.8
ruin 18.5
tourist 18.3
city 18.3
historical 17.9
desert 17.8
grave 17.6
sand 15.2
fortress 14.4
religion 14.4
roof 14.1
antique 13.9
construction 13.7
structure 13.6
tower 13.4
landscape 13.4
vacation 13.1
past 12.6
house 12.4
town 12.1
hill 11.3
vault 10.3
place 10.3
art 9.9
civilization 9.8
archeology 9.8
heritage 9.7
rock 9.6
castle 9.4
clouds 9.3
traditional 9.2
aged 9.1
mountain 8.9
grunge 8.5
site 8.5
religious 8.4
exterior 8.3
fortification 8.3
outdoors 8.2
fort 7.9
walls 7.8
scene 7.8
medieval 7.7
buildings 7.6
park 7.5
village 7.3
holiday 7.2
river 7.1
sandstone 7

Google
created on 2019-09-25

Microsoft
created on 2019-09-25

drawing 97.9
person 92.3
outdoor 85.3
old 81.4
art 67.1
clothing 60.5
stone 41
painting 16.5

Color Analysis

Face analysis

AWS Rekognition

Age 36-52
Gender Male, 52%
Sad 49.4%
Calm 49.3%
Confused 45.1%
Disgusted 45.1%
Fear 45%
Happy 45.9%
Angry 45.1%
Surprised 45.1%

AWS Rekognition

Age 17-29
Gender Female, 52.5%
Surprised 45.1%
Calm 45.1%
Sad 45.9%
Fear 45.1%
Happy 53.4%
Disgusted 45%
Confused 45%
Angry 45.5%

AWS Rekognition

Age 16-28
Gender Female, 53.9%
Happy 45.4%
Disgusted 45.1%
Sad 47.6%
Angry 45.4%
Confused 45%
Calm 50.9%
Fear 45.5%
Surprised 45.1%

AWS Rekognition

Age 37-55
Gender Male, 50.5%
Sad 50.1%
Disgusted 49.5%
Fear 49.5%
Calm 49.8%
Angry 49.5%
Surprised 49.5%
Confused 49.5%
Happy 49.5%

AWS Rekognition

Age 14-26
Gender Male, 53.2%
Angry 45.1%
Happy 45.1%
Disgusted 45%
Sad 50.6%
Fear 45%
Surprised 45%
Calm 49.1%
Confused 45.1%

Microsoft Cognitive Services

Age 71
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 99.8%
Person 99.7%