Human Generated Data

Title

Mother and child

Date

c. 1924

People

Artist: Elsa Schmid, American (1897-1970)

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Peter G. and Elizabeth S. Neumann, 2009.192

Machine Generated Data

Tags

Amazon
created on 2019-04-04

Art 98.9
Human 97.6
Drawing 97.6
Person 87.3
Sketch 85.5
Painting 68.3
Doodle 57.6
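
The Amazon tags above are the sort of labels returned by AWS Rekognition's DetectLabels operation. A minimal sketch of such a call using boto3; the file name and region are placeholders, not part of this record:

    import boto3

    # Hypothetical local copy of the image; not part of this record.
    with open("2009.192.jpg", "rb") as f:
        image_bytes = f.read()

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=10,
        MinConfidence=50,
    )

    # Each label carries a name and a confidence score (0-100),
    # matching the "tag  score" pairs listed above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")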

Clarifai
created on 2018-03-23

people 99.8
print 99.1
art 99.1
man 98.8
engraving 98.6
one 98.4
illustration 98.3
adult 97.7
woodcut 97.2
portrait 96.6
veil 95.1
group 92.8
two 92.5
wear 92
ancient 91.9
old 91.8
visuals 91.7
sculpture 91.5
religion 91.3
painting 91.1

Imagga
created on 2018-03-23

sketch 70.4
drawing 55.8
representation 41.5
glass 21.5
old 19.5
drink 19.2
paper 18.8
container 17.8
design 16.9
art 16.3
water 14
transparent 13.4
object 13.2
liquid 13
drop 12.7
pour 12.6
black 12.6
vintage 12.4
cup 12.4
antique 12.3
retro 12.3
bubble 12.2
vase 12
close 12
currency 11.7
vessel 11.5
bubbles 11.4
ancient 11.2
ornament 11.2
texture 11.1
money 11.1
beverage 11
finance 11
decoration 10.5
pattern 10.3
pencil 10.2
light 10.1
cash 10.1
wealth 9.9
detail 9.7
alcohol 9.6
architecture 9.4
grunge 9.4
dollar 9.3
classic 9.3
wet 8.9
reflection 8.9
financial 8.9
celebration 8.8
clear 8.7
artistic 8.7
cold 8.6
exchange 8.6
motion 8.6
ice 8.5
banking 8.3
historic 8.2
single 8.2
style 8.2
symbol 8.1
graphic 8
shape 7.9
business 7.9
color 7.8
luxury 7.7
stone 7.7
bill 7.6
clean 7.5
splash 7.5
freshness 7.5
savings 7.5
backdrop 7.4
flow 7.4
bar 7.4
bank 7.2
shiny 7.1
market 7.1
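
The Imagga tags resemble the output of Imagga's image-tagging REST service. A minimal sketch using requests with HTTP Basic auth, assuming the v2 /tags endpoint; the credentials and image URL are placeholders:

    import requests

    # Placeholder credentials and image URL; not part of this record.
    api_key = "<imagga-api-key>"
    api_secret = "<imagga-api-secret>"
    image_url = "https://example.org/2009.192.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(api_key, api_secret),
    )
    response.raise_for_status()

    # Each entry pairs an English tag with a confidence score (0-100).
    for entry in response.json()["result"]["tags"]:
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")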

Google
created on 2018-03-23

art 84.7
drawing 78.4
illustration 70.4
black and white 67.8
artwork 66.6
still life 65.3
visual arts 56.3
printmaking 54.9
font 50.7
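
The Google tags are the kind of labels returned by the Cloud Vision API's label detection. A minimal sketch using the google-cloud-vision client library; the file name is a placeholder, and the scores (returned in the range 0-1) are scaled to percentages for comparison with the list above:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Hypothetical local copy of the image; not part of this record.
    with open("2009.192.jpg", "rb") as f:
        content = f.read()

    image = vision.Image(content=content)
    response = client.label_detection(image=image)

    # label.score is a float in [0, 1]; multiply by 100 to compare
    # with the percentages listed above.
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")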

Microsoft
created on 2018-03-23

text 99.9
book 99.8

Face analysis

AWS Rekognition

Age 35-52
Gender Male, 95.4%
Angry 55.4%
Disgusted 1.5%
Happy 3.4%
Confused 6.8%
Sad 6%
Calm 11.5%
Surprised 15.5%
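
The age range, gender, and emotion scores above follow the shape of AWS Rekognition's DetectFaces response. A minimal sketch using boto3; the file name and region are placeholders:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the image; not part of this record.
    with open("2009.192.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")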

Microsoft Cognitive Services

Age 38
Gender Male

Feature analysis

Amazon

Person 87.3%
Painting 68.3%

Captions

Microsoft
created on 2018-03-23

a close up of a book 45.5%
close up of a book 40.6%
a close up of a book cover 40.5%
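
The captions above are the kind of description produced by Azure Computer Vision's Describe Image operation. A minimal sketch calling the REST endpoint with requests, assuming the v3.2 /describe route; the endpoint, key, and file name are placeholders:

    import requests

    # Placeholder endpoint and key; not part of this record.
    endpoint = "https://<your-resource>.cognitiveservices.azure.com"
    key = "<subscription-key>"

    # Hypothetical local copy of the image; not part of this record.
    with open("2009.192.jpg", "rb") as f:
        image_bytes = f.read()

    response = requests.post(
        f"{endpoint}/vision/v3.2/describe",
        headers={
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/octet-stream",
        },
        params={"maxCandidates": 3},
        data=image_bytes,
    )
    response.raise_for_status()

    # Each caption has a text and a confidence in [0, 1].
    for caption in response.json()["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")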

Text analysis

Amazon

E
E Schl.id
12/2.
Schl.id
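
The detected strings above match the output format of AWS Rekognition's DetectText operation, which returns both full lines and the individual words within them. A minimal sketch using boto3; the file name and region are placeholders:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the image; not part of this record.
    with open("2009.192.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # LINE detections correspond to the strings listed above;
    # WORD detections break them into individual tokens.
    for detection in response["TextDetections"]:
        print(f"{detection['Type']}: {detection['DetectedText']} "
              f"({detection['Confidence']:.1f}%)")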