Human Generated Data

Title

Ben Shahn, Fogg Art Museum, Harvard University, December 4 to January 19

Date

1956

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Harvard University Art Museum Archives, M23580

Copyright

© Estate of Ben Shahn / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Human 97.9
Art 93.7
Drawing 93.7
Text 78.9
Doodle 73.3
Label 69.4
Person 67.1
Sketch 60.8

Clarifai
created on 2019-10-29

illustration 97.4
people 97.2
chalk out 95.1
art 91.9
print 91.6
text 89.8
bill 89.5
vector 88.5
man 88.5
adult 88.4
retro 87
portrait 86.9
one 82.8
vintage 82.6
sketchy 80.7
sketch 80.4
woman 78.8
design 78.6
scribble 76.5
card 75.2

Imagga
created on 2019-10-29

drawing 49.8
sketch 37
representation 33.5
cone 25.3
cartoon 25
art 21.6
map 19.5
decoration 19
design 18
black 17.4
card 14.8
diagram 14.6
line 14.6
silhouette 14.1
holiday 12.9
jelly 12.7
pattern 12.3
symbol 12.1
graphic 11.7
love 11.1
atlas 10.9
shape 10.9
element 10.7
style 10.4
celebration 10.4
icon 10.3
greeting 10.2
clip art 10.2
substance 10.2
planet 9.8
comic 9.7
outline 9.5
color 9.5
painting 9.2
wallpaper 9.2
animal 9
fun 9
humor 8.7
ornament 8.6
character 8.5
decorative 8.4
retro 8.2
world 8
paper 7.9
season 7.8
geography 7.7
vintage 7.6
happy 7.5
page 7.4
gold 7.4
man 7.4
heart 7.4
funny 7.3
template 7.3
object 7.3
artwork 7.3
new 7.3
romance 7.1

Google
created on 2019-10-29

Text 88.1
Cartoon 86.9
Illustration 70.1
Font 68.6
Art 58.1
Vegetarian food 56.6

Microsoft
created on 2019-10-29

drawing 98.5
cartoon 98.4
text 97.9
sketch 95.4
illustration 95.1
human face 83.8
person 79.8
child art 76.2
poster 76
graphic 72.4
design 69.6
clothing 65.2
art 59.8

Face analysis

Amazon

AWS Rekognition

Age 35-51
Gender Male, 95.1%
Sad 19.5%
Disgusted 1.1%
Surprised 1.6%
Happy 0.4%
Calm 51.6%
Fear 6.4%
Confused 12.6%
Angry 6.8%

Feature analysis

Amazon

Person 67.1%

Captions

Microsoft

a close up of text on a white background 81.1%
a close up of text on a black background 78.6%
close up of text on a white background 78.5%

Text analysis

Amazon

FOGG
MUSEUM
ART MUSEUM
ART
ECEMBER
UNIVERSITY
TOJANUARY19
FOGG HARVARD UNIVERSITY
ECEMBER 4 TOJANUARY19
HARVARD
4
BEn
BEn Sao
Sao

Google

ה,
Ben
MUSEUM
DECEMBER4
La
ART
UNIVERSITY
HARVARD
19
דו
FOGG
TO
ה,דו Ben La FOGG ART MUSEUM HARVARD UNIVERSITY DECEMBER4 TO JANUARY 19
JANUARY