Human Generated Data

Title

Ben Shahn, Fogg Art Museum, Harvard University, December 4 to January 19

Date

1956

People

Artist: Ben Shahn, American, 1898–1969

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Harvard University Art Museum Archives, M23579

Copyright

© Estate of Ben Shahn / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Human 98.7
Drawing 96.9
Art 96.9
Doodle 84.5
Sketch 77.3
Text 72
Person 69.2

Clarifai
created on 2019-10-29

chalk out 98.5
illustration 98.4
people 96
vector 94.5
art 93.5
sketchy 93.1
print 92.3
sketch 90.5
scribble 90.2
retro 90
one 88.1
adult 86.9
no person 86.2
man 86
text 84.2
bill 82
portrait 81.8
freehand 79.6
caricature 79.2
painting 77.7

Imagga
created on 2019-10-29

sketch 100
drawing 100
representation 74.5
art 23.8
cartoon 23.2
map 21.6
black 19.9
design 18.6
outline 17.1
silhouette 16.6
atlas 16
line 13.7
sepia 13.6
geography 13.5
vintage 13.3
graphic 13.1
old 12.5
retro 12.3
antique 12.2
road 11.8
pattern 11.6
wallpaper 11.5
symbol 11.5
man 11.4
plan 11.4
style 11.1
animal 10.6
nation 10.4
travel 9.9
expedition 9.9
boundary 9.9
location 9.8
find 9.8
position 9.8
route 9.8
world 9.8
discovery 9.7
fun 9.7
continent 9.7
states 9.7
navigation 9.6
capital 9.5
paper 9.5
clip art 9.3
planet 9.2
city 9.2
element 9.1
tourism 9.1
gold 9
geographic 8.9
decoration 8.9
navigate 8.8
explore 8.8
guide 8.8
country 8.8
tour 8.7
direction 8.6
journey 8.5
globe 8.3
painting 8.1
card 8
shape 8
icon 7.9
dutch 7.8
comic 7.8
contour 7.8
humor 7.7
path 7.6
note 7.4
love 7.1
curve 7

Google
created on 2019-10-29

Line art 92
Text 90.3
Organ 78.7
Drawing 76.1
Font 75
Illustration 74.9
Art 67.7
Jaw 57.4
Sketch 55.9

Microsoft
created on 2019-10-29

text 99
drawing 98.9
sketch 98.5
cartoon 96.8
illustration 95.3
human face 83.9
person 79
design 70.8
handwriting 67.9
man 52.2
graphic 52.1
clothing 50.1

Face analysis

Amazon

AWS Rekognition

Age 46-64
Gender Male, 95.9%
Calm 47.8%
Sad 22.4%
Angry 4.1%
Surprised 1.4%
Disgusted 1.1%
Happy 0.3%
Fear 8%
Confused 14.8%

Feature analysis

Amazon

Person 69.2%

Captions

Microsoft

a close up of text on a white background 81.9%
a close up of text on a black background 79.8%
close up of text on a white background 79.7%

Text analysis

Amazon

FOGG
HARVARD
ART
ART MUSEUM
MUSEUM
DECEMBER
BEn
FOGG HARVARD UNIVERSiTY
DECEMBER 4 TOJANUARY19
TOJANUARY19
4
UNIVERSiTY
BEn Sah'
Sah'

Google

ART
HARVARD
DE
19
Ben Lah FOGG ART MUSEUM HARVARD UNIVERSITY DE CEMBER4 TO JANUARY 19
Ben
FOGG
MUSEUM
UNIVERSITY
Lah
CEMBER4
TO
JANUARY