Human Generated Data

Title

Tribute Money

Date

1783

People

Artist: Valentine Green, British 1739 - 1813

Artist after: John Singleton Copley, American 1738 - 1815

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from Harvard University, Gift of John Singleton Copley, G4674

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Person 98.8
Human 98.8
Painting 97.8
Art 97.8
Person 97.8
Person 96.4
Person 95.9
Person 94.5
Person 91.8
Person 81.1
Photo 57.6
Face 57.6
Portrait 57.6
Photography 57.6
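
The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's label detection. Below is a minimal sketch of how such tags could be reproduced with boto3; the image filename is a placeholder, not part of the museum record, and AWS credentials are assumed to be configured.

```python
# Minimal sketch: label/confidence pairs in the style of the Amazon tags above,
# using Amazon Rekognition via boto3. The image filename is a placeholder.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("tribute_money.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    # Prints e.g. "Person 98.8", matching the format of the list above
    print(f"{label['Name']} {label['Confidence']:.1f}")
```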

Clarifai
created on 2018-02-10

people 99.9
group 99.7
adult 97.1
many 96.2
woman 95.1
group together 94.9
man 94.7
several 93.6
portrait 93
music 92.3
leader 90.4
wear 89.1
administration 87
child 85.2
five 81.4
religion 80.8
musician 80.4
outfit 78.6
facial expression 78.3
art 77.7

Imagga
created on 2018-02-10

tattoo 29.6
black 22
art 19.7
decoration 19.2
design 19.1
person 16.1
face 15.6
portrait 15.5
man 14.9
male 12.8
money 12.8
religion 12.5
adult 12.3
sculpture 11.8
currency 11.7
model 11.7
human 11.2
sexy 11.2
one 11.2
attractive 11.2
church 11.1
cash 11
symbol 10.8
people 10.6
style 10.4
statue 9.7
antique 9.5
ancient 9.5
hair 9.5
culture 9.4
covering 9.3
dollar 9.3
banking 9.2
close 9.1
mask 9.1
pretty 9.1
fashion 9
detail 8.9
finance 8.4
religious 8.4
famous 8.4
old 8.4
dark 8.3
vintage 8.3
bank 8.1
history 8
financial 8
business 7.9
look 7.9
artistic 7.8
architecture 7.8
cathedral 7.7
expression 7.7
god 7.7
figure 7.6
head 7.6
cemetery 7.5
economy 7.4

Google
created on 2018-02-10

Microsoft
created on 2018-02-10

person 98.9
text 95.1
group 64.8
old 56.9
crowd 0.6
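
The Microsoft tags follow the same label/confidence pattern. Below is a hedged sketch using the Azure Computer Vision "tag" REST operation; the endpoint, subscription key, and filename are placeholders, and the v3.2 API version is an assumption rather than something stated in the record.

```python
# Sketch only: requesting image tags from the Azure Computer Vision "tag"
# operation over REST. Endpoint, key, and filename are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<subscription-key>"  # placeholder

with open("tribute_money.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Confidence is returned on a 0-1 scale; scale to match e.g. "person 98.9"
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```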

Color Analysis

Face analysis

AWS Rekognition

Age 57-77
Gender Male, 99.3%
Disgusted 0.4%
Confused 3.3%
Happy 0.5%
Angry 3.7%
Calm 76.6%
Surprised 1.2%
Sad 14.2%

AWS Rekognition

Age 48-68
Gender Male, 97.4%
Surprised 1.7%
Disgusted 1.2%
Confused 1.4%
Happy 1.7%
Calm 85.1%
Sad 6.6%
Angry 2.3%

AWS Rekognition

Age 26-43
Gender Male, 99.6%
Confused 22.4%
Calm 8.1%
Happy 8.8%
Disgusted 29%
Angry 14.9%
Sad 10.2%
Surprised 6.6%

AWS Rekognition

Age 26-43
Gender Female, 57.7%
Happy 0.7%
Sad 4.7%
Calm 84.4%
Confused 2.5%
Angry 2.3%
Disgusted 1.1%
Surprised 4.3%

AWS Rekognition

Age 48-68
Gender Female, 57.7%
Angry 4.3%
Happy 1.1%
Disgusted 0.9%
Confused 1.8%
Sad 40.8%
Calm 48%
Surprised 3.1%

AWS Rekognition

Age 45-66
Gender Male, 95.3%
Sad 66.8%
Disgusted 1.7%
Angry 3.1%
Calm 21.8%
Surprised 1.4%
Happy 2.6%
Confused 2.6%
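
The AWS Rekognition entries above report, for each detected face, an estimated age range, a gender guess with confidence, and a set of emotion percentages. A minimal sketch of retrieving the same attributes with boto3 follows; the filename is a placeholder.

```python
# Sketch: per-face age range, gender, and emotions in the style of the
# AWS Rekognition entries above. The image filename is a placeholder.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("tribute_money.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed to get age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # e.g. "Calm 76.6%"
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```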

Microsoft Cognitive Services

Age 59
Gender Male

Microsoft Cognitive Services

Age 32
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
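
The Google Vision rows report likelihood buckets (for example "Very unlikely") rather than percentages. Below is a sketch assuming the google-cloud-vision 2.x Python client, application credentials already configured, and a placeholder filename.

```python
# Sketch: per-face likelihood buckets (surprise, anger, sorrow, joy, headwear,
# blur) from the Google Cloud Vision API. The filename is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application credentials are set

with open("tribute_money.jpg", "rb") as f:  # placeholder filename
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Each field is a Likelihood enum such as VERY_UNLIKELY or UNLIKELY
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```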

Feature analysis

Amazon

Person 98.8%
Painting 97.8%