Human Generated Data

Title

The Powderly Circular: Cyrus McCormick & Terence V. Powderly (Haymarket Series); verso: rough sketch

Date

1934-1935

People

Artist: Mitchell Siporin, American, 1910-1976

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Miriam, Rachel, and Judith Siporin, 1997.25

Machine Generated Data

Tags (values are each service's confidence score on a 0-100 scale)

Amazon
created on 2019-04-08

Human 99
Person 99
Art 96.1
Drawing 96.1
Person 93.4
Sketch 89.1
Hand 82.5
Text 56.8
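
The label/score pairs above come from an automated tagger. As a hedged illustration only, not the museums' documented pipeline, the sketch below shows how tags in this shape could be produced with Amazon Rekognition's DetectLabels operation through boto3; the image file name, region, and thresholds are placeholder assumptions.

```python
import boto3

# Assumed region and placeholder file name -- not taken from the source page.
client = boto3.client("rekognition", region_name="us-east-1")

with open("siporin_powderly_circular.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,      # cap the number of returned labels
    MinConfidence=50,  # drop labels the model is less than 50% confident about
)

# Each label carries a name and a confidence score (0-100),
# the same "tag  score" shape as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```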

Clarifai
created on 2018-03-16

people 99.8
adult 99.2
man 99
print 98.3
two 98.1
art 97.9
one 97.4
illustration 96.4
portrait 95.8
group 94
painting 92.7
leader 92
wear 89.5
woman 89.4
engraving 88.9
three 87.3
religion 82.2
veil 79.3
Renaissance 78.4
four 78

Imagga
created on 2018-03-16

sketch 86
drawing 66
representation 53.9
art 36.1
decoration 35.7
tattoo 35.6
religion 26.9
design 25.1
religious 20.6
culture 18.8
sculpture 18
ancient 17.3
church 16.6
architecture 16.4
temple 16.1
statue 15.3
graffito 15.3
artwork 13.7
face 13.5
traditional 13.3
old 13.2
god 12.4
detail 12.1
body 12
stone 11.8
vintage 11.6
retro 11.5
man 11.4
girls 10.9
head 10.9
history 10.7
holy 10.6
spirituality 10.6
antique 10.4
fashion 9.8
black 9.6
faith 9.6
artistic 9.6
people 9.5
travel 9.2
human 9
carving 8.9
pattern 8.9
stamp 8.8
mask 8.8
symbol 8.8
close 8.6
figure 8.6
money 8.5
oriental 8.5
portrait 8.4
monument 8.4
decorative 8.4
color 8.3
tourism 8.3
sensuality 8.2
style 8.2
dress 8.1
comic book 8.1
currency 8.1
hair 7.9
women 7.9
postmark 7.9
belief 7.8
model 7.8
mysterious 7.8
wall 7.7
spiritual 7.7
mail 7.7
painted 7.6
one 7.5
letter 7.3
cash 7.3
painting 7.2
holiday 7.2

Google
created on 2018-03-16

man 91.7
art 90.8
human behavior 83.8
drawing 71.4
arm 69.9
illustration 66.2
history 65.2
muscle 63.9
visual arts 54.9
artwork 53.3

Microsoft
created on 2018-03-16

text 99.9
book 99
woman 97.6
person 96.2
posing 87.6
black 69.9
female 35.5

Face analysis

AWS Rekognition

Age 35-52
Gender Male, 94.7%
Angry 2.7%
Sad 44.6%
Calm 43.9%
Disgusted 1.4%
Happy 1.4%
Surprised 1.6%
Confused 4.5%

AWS Rekognition

Age 26-43
Gender Male, 84.5%
Disgusted 2%
Surprised 13.6%
Happy 6.3%
Confused 21.2%
Calm 36%
Sad 14.1%
Angry 6.8%

AWS Rekognition

Age 35-52
Gender Male, 95.9%
Angry 2.2%
Sad 20.8%
Happy 0.3%
Surprised 0.7%
Confused 17.7%
Disgusted 0.6%
Calm 57.9%
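
The age ranges, gender calls, and emotion percentages above have the shape of Rekognition's DetectFaces output with full attributes requested. A minimal sketch under that assumption (placeholder file name and region, not the museums' actual code):

```python
import boto3

# Placeholder region and file name -- assumptions for illustration only.
client = boto3.client("rekognition", region_name="us-east-1")

with open("siporin_powderly_circular.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

# One block per detected face, mirroring the repeated
# "AWS Rekognition" entries above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```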

Microsoft Cognitive Services

Age 71
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
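
Google Vision reports face attributes as likelihood buckets rather than percentages, which matches the "Very unlikely" / "Very likely" wording above. A minimal sketch, assuming the google-cloud-vision Python client, configured credentials, and a placeholder file name:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes credentials are already set up

with open("siporin_powderly_circular.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum (VERY_UNLIKELY .. VERY_LIKELY),
# corresponding to the "Very unlikely" / "Very likely" labels above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```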

Feature analysis

Amazon

Person 99%

Categories

Imagga

paintings art 99.9%
interior objects 0.1%