Human Generated Data

Title

Henry Laurens

Date

October 1, 1782

People

Artist: Valentine Green, British 1739 - 1813

Artist after: John Singleton Copley, American 1738 - 1815

Publisher: John Joseph Stockdale, British 1770 - 1847

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from Harvard University, Gift of John Singleton Copley, G4676

Machine Generated Data

Tags

Amazon
created on 2019-03-29

Person 99.5
Human 99.5
Art 97.8
Painting 79.6

Clarifai
created on 2018-02-09

people 100
adult 98.8
print 97.9
one 97.4
furniture 96.9
group 96.7
leader 95
home 94.4
man 94
administration 93.3
seat 92.4
wear 92
vehicle 90.3
two 90.2
sit 89.8
engraving 89
military 88.2
outfit 87.2
portrait 86.5
elderly 86.5

Imagga
created on 2018-02-09

dress 22.6
portrait 18.8
clothing 17.1
art 16.5
person 16
fashion 15.8
people 15.6
culture 15.4
statue 15.1
old 14.6
traditional 14.1
man 13.4
lady 13
face 12.8
adult 12.6
religion 12.5
black 12.2
antique 12.1
ancient 12.1
posing 11.5
sculpture 11.1
robe 10.3
religious 10.3
clothes 10.3
monument 10.3
costume 10.2
armor 10.2
architecture 10.2
historic 10.1
elegance 10.1
male 10
attractive 9.8
covering 9.7
outfit 9.7
hair 9.5
golden 9.5
historical 9.4
wall 9.4
model 9.3
makeup 9.2
kimono 9.1
vintage 9.1
decoration 8.9
mask 8.8
holy 8.7
garment 8.6
luxury 8.6
pretty 8.4
color 8.3
holding 8.3
tourism 8.2
human 8.2
style 8.2
sexy 8
building 8
theater 7.8
marble 7.7
sax 7.7
war 7.7
faith 7.7
god 7.7
head 7.6
blond 7.5
mother 7.4
gold 7.4
tradition 7.4
tourist 7.4
make 7.3
world 7.2
history 7.2
interior 7.1

Google
created on 2018-02-09

Microsoft
created on 2018-02-09

text 92.7
outdoor 87.6
person 87.2
old 85.9
black 65

Color Analysis

Face analysis

AWS Rekognition

Age 48-68
Gender Male, 93.9%
Happy 0.6%
Sad 19%
Disgusted 1.3%
Angry 3.7%
Confused 3.8%
Calm 69%
Surprised 2.6%

AWS Rekognition

Age 45-66
Gender Female, 50.4%
Surprised 49.6%
Disgusted 49.6%
Happy 49.6%
Calm 49.5%
Sad 49.8%
Angry 49.9%
Confused 49.5%

Microsoft Cognitive Services

Age 52
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Painting 79.6%

Captions

Microsoft
created on 2018-02-09

a vintage photo of a man 88.3%
an old photo of a man 88.2%
old photo of a man 86.6%