Human Generated Data

Title

Interior with Figures

Date

1930

People

Artist: Eugene Berman, American, 1899–1972

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Benjamin Rowland, M10035

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Person 98.9
Human 98.9
Person 98.8
Art 97.4
Drawing 96.4
Sketch 90.5
Person 88.6
Painting 77.5

Clarifai
created on 2019-10-29

print 100
art 99.9
illustration 99.8
people 99.7
painting 99.1
engraving 98.7
adult 97
vintage 96.6
group 95
man 94.9
furniture 94.9
visuals 94.7
etching 93.4
old 93.2
lithograph 91.7
wear 91.7
retro 90.7
woodcut 90.5
text 90.4
one 90.4

Imagga
created on 2019-10-29

old 46.7
wall 40.5
stucco 33.7
grunge 31.5
vintage 30.6
texture 30.6
architecture 27.4
building 26.4
ancient 24.2
door 24
window 23.4
aged 22.6
dirty 22.6
antique 22.5
retro 21.3
textured 19.3
brick 18.4
stone 17.5
weathered 17.1
grungy 17.1
house 16.8
structure 16.2
brown 16.2
rough 15.5
detail 15.3
history 15.2
art 15.1
surface 15
frame 15
city 15
decoration 14.8
paint 14.5
paper 14.2
device 14.1
exterior 13.8
historic 13.8
construction 13.7
pattern 13.7
graffito 12.9
rust 12.5
damaged 12.4
urban 12.2
empty 12
blank 12
abandoned 11.7
material 11.6
travel 11.3
street 11
arch 10.9
wood 10.8
tourism 10.7
worn 10.5
drawing 10.3
black 10.2
design 10.1
cement 9.7
decay 9.6
sketch 9.4
historical 9.4
glass 9.3
space 9.3
page 9.3
color 8.9
wooden 8.8
home 8.8
torn 8.7
parchment 8.6
yellow 8.6
concrete 8.6
rusty 8.6
wallpaper 8.4
snow 7.9
rural 7.9
crack 7.7
messy 7.7
broken 7.7
culture 7.7
stained 7.7
architectural 7.7
village 7.7
painted 7.6
close 7.4
tourist 7.4

Google
created on 2019-10-29

Painting 86.2
Art 85
Modern art 84.8
Picture frame 83.9
Illustration 80.4
Visual arts 77.8
Drawing 74.6
Printmaking 65.7
Room 65.7
Artwork 60.6

Microsoft
created on 2019-10-29

drawing 99.8
sketch 99.6
text 97.7
art 97.3
painting 95.5
illustration 91.3
gallery 85.6
child art 85.4
cartoon 68.7
person 66.6
room 46.8

Face analysis

Amazon

AWS Rekognition

Age 17-29
Gender Male, 50.2%
Calm 49.6%
Disgusted 49.6%
Sad 49.5%
Angry 50.3%
Confused 49.5%
Fear 49.6%
Happy 49.5%
Surprised 49.5%

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a black sign with white text 33.6%
a black sign with white letters 30.6%
a white sign with black text 30.5%

Text analysis

Amazon

fe