Human Generated Data

Title

The Ransom

Date

c. 1860

People

Artist: John Everett Millais, British 1829 - 1896

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Grenville L. Winthrop, 1943.539

Machine Generated Data

Tags (label and confidence score, 0-100)

Amazon
created on 2023-10-18

Art 100
Painting 100
Person 99.1
Person 97.7
Person 97.1
Adult 97.1
Female 97.1
Woman 97.1
Person 95.5
Person 94.8
Person 93.6
Face 87.7
Head 87.7
Photography 56.9
Portrait 56.9
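
The label/confidence pairs above match the shape of output produced by Amazon Rekognition's DetectLabels operation. A minimal sketch of how tags like these could be retrieved with boto3; the bucket and file names are placeholders, not part of this record:

```python
import boto3

# Region and credentials come from the local AWS configuration.
rekognition = boto3.client("rekognition")

# Placeholder image reference, not the actual museum asset.
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "the-ransom.jpg"}},
    MinConfidence=50,
)

# Each label carries a name and a 0-100 confidence score, mirroring the
# "Art 100", "Painting 100", "Person 99.1" entries above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```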

Clarifai
created on 2023-10-18

people 99.9
art 99.5
painting 99.5
print 99.4
group 99.1
veil 99
woman 98.7
child 98.7
man 98.6
wear 98.3
son 98.3
adult 98.3
lid 97.4
illustration 97.3
baby 96.2
religion 95.5
costume 94.7
dress 93.9
weapon 93.1
flute 92.9

Imagga
created on 2018-12-18

man 29.6
person 23.8
male 23.4
people 22.3
weapon 19.3
gun 19.1
soldier 18.6
old 16
oriental 15.8
military 15.4
art 15.4
mask 14.9
uniform 14.9
culture 14.5
portrait 13.6
costume 13.3
camouflage 13.3
clothing 12.9
face 12.8
dress 12.6
war 12.6
traditional 12.5
adult 12.4
church 12
religion 11.6
guy 11.2
army 10.7
happy 10.6
holy 10.6
saint 10.6
faith 10.5
god 10.5
group 10.5
religious 10.3
protection 10
bible 9.8
lady 9.7
prayer 9.7
child 9.6
couple 9.6
antique 9.5
color 9.5
hat 9.4
leisure 9.1
vintage 9.1
fashion 9
attendant 8.9
warrior 8.8
love 8.7
spiritual 8.6
holiday 8.6
danger 8.2
style 8.2
paint 8.1
equipment 8.1
decoration 8.1
detail 8
colorful 7.9
together 7.9
catholic 7.8
men 7.7
attractive 7.7
cathedral 7.7
outdoors 7.5
figure 7.4
tradition 7.4
covering 7.2
design 7.2
activity 7.2
history 7.2
air gun 7.1
mother 7.1
romantic 7.1
kin 7.1

Google
created on 2018-12-18

Microsoft
created on 2018-12-18

person 98.2
people 69.4
painting 18.9
art 6.6
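
The Microsoft tags above resemble the Tags feature of the Azure Computer Vision image-analysis REST API. A hedged sketch using plain HTTP; the endpoint host, API version, and image URL are assumptions, not details taken from this record:

```python
import requests

# Placeholder resource endpoint and subscription key.
endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<subscription-key>"

response = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"},
    json={"url": "https://example.org/the-ransom.jpg"},  # placeholder image URL
)
response.raise_for_status()

# Tags come back with a name and a 0-1 confidence; scaling by 100 gives
# figures comparable to the "person 98.2", "people 69.4" values above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```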

Color Analysis

Face analysis

AWS Rekognition

Age 4-10
Gender Female, 100%
Sad 100%
Surprised 6.4%
Fear 6%
Confused 1%
Calm 0.2%
Disgusted 0.2%
Angry 0.1%
Happy 0.1%

AWS Rekognition

Age 18-26
Gender Male, 67.4%
Calm 99.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Disgusted 0%
Angry 0%
Confused 0%
Happy 0%

AWS Rekognition

Age 39-47
Gender Male, 54.1%
Calm 95.4%
Surprised 6.7%
Fear 6.5%
Sad 2.3%
Confused 0.5%
Angry 0.4%
Happy 0.3%
Disgusted 0.3%

AWS Rekognition

Age 23-31
Gender Male, 98.5%
Calm 69.9%
Fear 16.9%
Surprised 7.3%
Angry 3.8%
Sad 2.9%
Disgusted 2.1%
Happy 1.5%
Confused 0.9%

AWS Rekognition

Age 38-46
Gender Male, 99.9%
Calm 90.5%
Surprised 6.7%
Fear 6.2%
Confused 6%
Sad 2.3%
Angry 0.7%
Disgusted 0.5%
Happy 0.2%
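
The five face records above (age range, gender, and ranked emotions) have the structure returned by Amazon Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch with a placeholder image file:

```python
import boto3

rekognition = boto3.client("rekognition")

# Placeholder file name; the actual record was produced from the museum image.
with open("the-ransom.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed to receive age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are returned unordered; sorting by confidence reproduces the
    # "Sad 100% / Surprised 6.4% / ..." ranking shown above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```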

Microsoft Cognitive Services

Age 49
Gender Male

Microsoft Cognitive Services

Age 23
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
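
The four Google Vision records report per-face likelihoods (surprise, anger, sorrow, joy, headwear, blurred) rather than numeric scores. A minimal sketch using the google-cloud-vision client; the file name is a placeholder:

```python
from google.cloud import vision

# Credentials are taken from GOOGLE_APPLICATION_CREDENTIALS.
client = vision.ImageAnnotatorClient()

with open("the-ransom.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY), which
# maps to the "Surprise Very unlikely", "Headwear Very likely" rows above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```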

Feature analysis

Amazon

Person 99.1%
Adult 97.1%
Female 97.1%
Woman 97.1%

Categories

Imagga

people portraits 61.8%
events parties 33.5%
pets animals 3.3%