Human Generated Data

Title

Hagar and Ishmael

Date

1798

People

Artist: Robert Dunkarton, British, 1744 - before 1817

Artist after: John Singleton Copley, American, 1738 - 1815

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from Harvard University, Gift of Gardiner Greene, G4404

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Painting 99
Art 99
Person 97.6
Human 97.6
Person 79.7

Clarifai
created on 2018-02-10

people 99.9
art 99.8
adult 99.4
painting 98.6
woman 98.4
group 97.1
print 96.8
religion 96.4
two 96.4
baroque 96.1
man 95.7
nude 94.4
illustration 94.4
Renaissance 93.3
one 92.5
cupid 91.4
furniture 91.4
saint 91
engraving 90.2
portrait 90.1

Imagga
created on 2018-02-10

cadaver 35.3
column 30.2
statue 29.8
sculpture 27.6
art 23.7
person 20.4
ancient 17.3
adult 16.8
stone 16.2
religion 16.1
attractive 16.1
model 15.6
old 15.3
sexy 15.3
portrait 14.9
decoration 14.3
face 14.2
people 14
man 13.5
god 13.4
figure 13.3
fashion 12.8
dress 12.7
architecture 12.5
culture 12
hair 11.9
male 11.4
travel 11.3
antique 11.3
religious 11.3
monument 11.2
body 11.2
tourism 10.7
style 10.4
carving 10.2
sensuality 10
design 9.9
pretty 9.8
human 9.8
one 9.7
marble 9.7
historical 9.4
fountain 9.4
tattoo 9.1
black 9
history 9
holy 8.7
spiritual 8.6
structure 8.6
world 8.4
dark 8.4
city 8.3
vintage 8.3
historic 8.3
makeup 8.2
lady 8.1
skin 8.1
detail 8.1
posing 8
love 7.9
catholic 7.8
wall 7.7
mystery 7.7
grunge 7.7
jacket 7.6
elegance 7.6
famous 7.5
church 7.4
temple 7.4
artwork 7.3
fantasy 7.2

Google
created on 2018-02-10

Microsoft
created on 2018-02-10

text 99.8
book 91.2
person 89.9
old 68.6
posing 51.2

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 14-23
Gender Male, 58.3%
Sad 24.1%
Happy 3.2%
Angry 33%
Calm 10%
Surprised 7.8%
Confused 15.5%
Disgusted 6.4%

AWS Rekognition

Age 20-38
Gender Female, 56.6%
Happy 1.3%
Surprised 3.8%
Angry 7.4%
Calm 12.7%
Disgusted 11.2%
Sad 55%
Confused 8.5%

AWS Rekognition

Age 23-38
Gender Female, 95.5%
Calm 47.9%
Surprised 17.6%
Sad 4%
Angry 6.7%
Disgusted 8.4%
Confused 11.5%
Happy 4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 99%
Person 97.6%