Human Generated Data

Title

Monmouth before James II

Date

c. 1795

People

Artist: John Singleton Copley, American, 1738-1815

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Louise E. Bettens Fund, 1917.67

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Painting 99.7
Art 99.7
Person 96.8
Human 96.8
Person 94.8
Person 92.8
Person 88.7
Person 87.6
Person 86.4
Person 86.3
Person 85
Person 63.8
Person 63.2
Leisure Activities 58.1
Person 48.9
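
The Amazon tags above are label-detection output. For reference, the sketch below shows how similar name/confidence pairs can be produced with AWS Rekognition through boto3; the image file name, region, and thresholds are placeholders, not the museum's actual pipeline.

```python
# Minimal sketch: AWS Rekognition label detection via boto3.
# Assumptions: boto3 installed, AWS credentials configured, and a local
# placeholder image file standing in for the painting photograph.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("monmouth_before_james_ii.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=45,  # the list above includes labels down to roughly 48.9
)

for label in response["Labels"]:
    # Each entry pairs a label name with a confidence score, e.g. "Painting 99.7"
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```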

Clarifai
created on 2018-03-16

people 99.8
art 99.6
group 98.8
painting 98.4
adult 98.1
religion 97.3
woman 97.3
man 96.8
dancer 96
Renaissance 94.5
child 94.2
dancing 94
kneeling 93.2
son 92.1
baby 91.7
royalty 91
wear 90.8
veil 90.6
gown (clothing) 90.3
position 88.8

Imagga
created on 2018-03-16

religion 26
throne 22.7
religious 22.5
temple 21.9
culture 21.4
traditional 19.9
art 18.3
metropolitan 18.2
chair of state 18.2
person 18.1
costume 17
spiritual 16.3
old 16
palanquin 15.8
gold 15.6
faith 15.3
chair 14.9
travel 14.8
golden 14.6
god 14.3
tradition 13.8
holy 13.5
church 12.9
people 12.8
color 12.8
belief 12.6
litter 12.6
spirituality 12.5
decoration 12.3
colorful 12.2
ancient 12.1
man 12.1
musical instrument 11.9
portrait 11.6
pray 11.6
worship 11.6
statue 11.4
antique 11.2
sculpture 11
architecture 10.9
sacred 10.7
drum 10.7
east 10.3
historic 10.1
fan 10
holiday 10
seat 9.9
male 9.9
tourism 9.9
prayer 9.7
saint 9.6
percussion instrument 9.6
adult 9.5
conveyance 9.5
historical 9.4
dancer 9.1
bangkok 8.8
bible 8.8
cultural 8.8
palace 8.7
festival 8.6
tourist 8.2
follower 8.1
interior 8
oriental 7.9
animal 7.9
face 7.8
scene 7.8
bull 7.7
mask 7.7
meditation 7.7
monument 7.5
catholic 7.3
detail 7.2
dress 7.2
history 7.2

Google
created on 2018-03-16

art 88.8
painting 77.3
middle ages 61
history 59.5
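
The Google tags above likewise come from label detection. A comparable call with the google-cloud-vision client library might look like the sketch below; the file name is a placeholder and the 0-100 scaling simply mirrors how scores are shown here.

```python
# Minimal sketch: Google Cloud Vision label detection.
# Assumptions: google-cloud-vision installed, application credentials set,
# and a local placeholder image file.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("monmouth_before_james_ii.jpg", "rb") as f:
    content = f.read()

image = vision.Image(content=content)
response = client.label_detection(image=image)

for annotation in response.label_annotations:
    # Scores are returned in [0, 1]; scale to match the 0-100 figures above.
    print(f"{annotation.description.lower()} {annotation.score * 100:.1f}")
```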

Face analysis

AWS Rekognition

Age 16-27
Gender Male, 53.3%
Disgusted 46.3%
Sad 46.4%
Calm 49.2%
Surprised 45.6%
Happy 46.9%
Confused 45.2%
Angry 45.5%

AWS Rekognition

Age 20-38
Gender Female, 50.4%
Sad 47%
Calm 47.9%
Angry 46.6%
Surprised 45.4%
Confused 45.4%
Happy 46%
Disgusted 46.8%

AWS Rekognition

Age 45-66
Gender Female, 51.4%
Sad 54.9%
Disgusted 45%
Surprised 45%
Calm 45%
Angry 45%
Happy 45%
Confused 45%

AWS Rekognition

Age 30-47
Gender Female, 53.2%
Surprised 45.1%
Disgusted 45.1%
Calm 45.1%
Confused 45.1%
Angry 45.2%
Happy 45.1%
Sad 54.3%

AWS Rekognition

Age 11-18
Gender Female, 52.7%
Disgusted 45.1%
Angry 45.4%
Calm 47%
Sad 52%
Happy 45.2%
Confused 45.2%
Surprised 45.1%

AWS Rekognition

Age 20-38
Gender Male, 50%
Angry 49.5%
Disgusted 49.5%
Calm 50.1%
Happy 49.6%
Sad 49.7%
Surprised 49.5%
Confused 49.5%

AWS Rekognition

Age 14-23
Gender Male, 52.2%
Calm 45%
Sad 45%
Confused 45%
Disgusted 54.9%
Happy 45%
Surprised 45%
Angry 45.1%

AWS Rekognition

Age 19-36
Gender Female, 50.3%
Calm 49.6%
Sad 49.8%
Confused 49.5%
Happy 49.6%
Disgusted 49.6%
Angry 49.8%
Surprised 49.6%

Microsoft Cognitive Services

Age 11
Gender Female
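
The per-face age, gender, and emotion estimates above follow the shape of AWS Rekognition's face-detection output. A minimal sketch of such a call with boto3 is shown below; the file name is a placeholder and the output formatting only approximates the listing above.

```python
# Minimal sketch: AWS Rekognition face detection with full attributes.
# Assumptions: boto3 installed, AWS credentials configured, placeholder image file.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("monmouth_before_james_ii.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions, not just bounding boxes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```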

Feature analysis

Amazon

Painting 99.7%
Person 96.8%