Human Generated Data

Title

Woman with Sledge Hammer

Date

1917

People

Artist: André Louis Armand Rassenfosse, Belgian, 1862–1934

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous loan, 62.2005

Machine Generated Data

Tags

Amazon
created on 2019-11-01

Person 96.7
Human 96.7
Art 93
Drawing 78.2
Sketch 66.2
Painting 64.9
Apparel 60.4
Hat 60.4
Clothing 60.4
Photography 60
Photo 60
Face 56.2
Portrait 56.2
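
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. A minimal Python sketch with boto3 follows; the local filename and the confidence threshold are assumptions for illustration, not part of the original record.

import boto3

# Hypothetical local copy of the image; Rekognition also accepts S3 objects.
IMAGE_PATH = "woman_with_sledge_hammer.jpg"

client = boto3.client("rekognition")
with open(IMAGE_PATH, "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

# Print each label with its confidence, e.g. "Person 96.7".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")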

Clarifai
created on 2019-11-01

art 99.3
illustration 98.9
print 98.8
people 98.5
painting 98.5
adult 97
one 95.4
vintage 95.4
portrait 94.8
retro 94.6
woman 93.8
man 93.2
veil 91.9
old 91.8
antique 88.5
chalk out 86.9
ancient 86.8
culture 83
paper 81.4
wear 79

Imagga
created on 2019-11-01

sketch 97.4
drawing 70.6
representation 60.3
art 29.3
religion 28.7
sculpture 23.1
statue 20.8
ancient 20.8
currency 20.7
money 20.4
culture 19.7
book jacket 19
cash 18.3
temple 18.2
god 18.2
bill 16.2
history 16.1
dollar 15.8
old 15.3
vintage 14.9
architecture 14.9
church 14.8
jacket 14.8
holy 14.5
tile 14.4
paper 14.2
religious 14.1
banking 13.8
bank 13.4
antique 13.3
finance 12.7
decoration 12.6
ruler 12.5
spirituality 12.5
financial 11.6
wrapping 11.2
historic 11
close 10.9
wealth 10.8
man 10.8
dollars 10.6
travel 10.6
one 10.5
golden 10.3
savings 10.3
economy 10.2
stone 10.1
mosaic 10.1
museum 10
business 9.7
exchange 9.6
historical 9.4
grunge 9.4
design 9.1
portrait 9.1
banknote 8.7
face 8.5
monument 8.4
investment 8.3
carving 8.1
symbol 8.1
prophet 7.9
masterpiece 7.9
figure 7.9
bible 7.8
banknotes 7.8
century 7.8
person 7.8
painter 7.8
covering 7.7
spiritual 7.7
rich 7.5
famous 7.4
style 7.4
gold 7.4
retro 7.4
icon 7.1

Google
created on 2019-11-01

Art 70.9
Painting 68.1
Drawing 63.2
Artwork 52.3

Microsoft
created on 2019-11-01

text 99.9
sketch 99.7
drawing 99.7
book 98.6
art 96.7
child art 90
human face 80.1
illustration 79.5
painting 69.5
person 61.2
cartoon 54.1
ink 53.2

Face analysis

AWS Rekognition

Age 21-33
Gender Female, 95.7%
Confused 0.3%
Happy 0.4%
Sad 5.1%
Calm 93.2%
Angry 0.4%
Surprised 0.4%
Disgusted 0%
Fear 0.2%
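
The age range, gender, and emotion percentages above are the fields Rekognition's DetectFaces call returns when all attributes are requested. A hedged boto3 sketch (filename assumed):

import boto3

client = boto3.client("rekognition")
with open("woman_with_sledge_hammer.jpg", "rb") as f:  # assumed filename
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # Calm, Sad, Happy, etc.
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")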

Microsoft Cognitive Services

Age 24
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
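
The Google Vision rows report likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch using the google-cloud-vision client (filename assumed):

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("woman_with_sledge_hammer.jpg", "rb") as f:  # assumed filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each field is a Likelihood enum, e.g. VERY_UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)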

Feature analysis

Amazon

Person 96.7%
Painting 64.9%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2019-11-01

a close up of a book 42.3%
close up of a book 36.7%
a hand holding a book 36.6%
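
The ranked captions above match the output of Azure Computer Vision's image-description operation. A sketch with the Python SDK; the endpoint, key, and filename are placeholders, and the caption count is an assumption.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-key>"),              # placeholder key
)

with open("woman_with_sledge_hammer.jpg", "rb") as f:  # assumed filename
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")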

Text analysis

Amazon

1917
TE 1917
TE
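
Rekognition's DetectText returns both whole lines and their individual words, which is why "TE 1917" appears alongside its parts. A boto3 sketch (filename assumed):

import boto3

client = boto3.client("rekognition")
with open("woman_with_sledge_hammer.jpg", "rb") as f:  # assumed filename
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Type is "LINE" or "WORD".
    print(detection["Type"], detection["DetectedText"])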

Google

TRa ssanfise 1917
TRa
ssanfise
1917
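
Google's OCR likewise returns the full detected string first ("TRa ssanfise 1917", apparently its reading of the Rassenfosse signature and date) followed by the individual tokens. A sketch with google-cloud-vision (filename assumed):

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("woman_with_sledge_hammer.jpg", "rb") as f:  # assumed filename
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
# The first annotation is the full text block; the rest are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)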