Human Generated Data

Title

[Window]

Date

1950 or 1952

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.321.8

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Silhouette 62.9
Spoke 60.2
Machine 60.2
Window 59.8
Face 59.7
Display 59.3
Electronics 59.3
LCD Screen 59.3
Screen 59.3
Monitor 59.3
Plant 57.4
Musician 55.5
Musical Instrument 55.5
Brick 55.2
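
The Amazon tags above are the kind of output produced by a general-purpose image-labeling service such as Amazon Rekognition: label names paired with confidence scores out of 100. As a rough sketch only (the museum's actual pipeline, image file, and confidence threshold are not documented in this record, so the file name and MinConfidence value below are assumptions), a comparable list could be generated with the boto3 Rekognition client:

    # Sketch: fetch image labels with Amazon Rekognition via boto3.
    # Assumes AWS credentials are configured and that "window.jpg" is a local
    # copy of the photograph; neither is specified in this record.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("window.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=55.0,  # assumed cutoff; the list above stops near 55
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")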

Clarifai
created on 2019-05-29

monochrome 99.8
people 98.1
black and white 95.6
analogue 95.2
abstract 95.1
street 95.1
art 94.6
light 93.4
storm 93.3
landscape 92.9
window 92.2
portrait 92.1
city 91.7
vehicle 91.2
no person 88.8
rain 88.7
adult 85.3
reflection 85
weather 84.9
shadow 84.2
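
The Clarifai tags follow the same pattern: concept names with confidence scores (Clarifai reports probabilities on a 0-1 scale, shown here scaled to 100). A rough sketch using the Clarifai 2.x Python client that was current when these tags were created in 2019; the API key and image URL are placeholders, and the exact model version the museum used is not recorded:

    # Sketch: predict general concepts with the Clarifai 2.x Python client.
    # The API key and image URL are assumed placeholders.
    from clarifai.rest import ClarifaiApp

    app = ClarifaiApp(api_key="<your-key>")
    model = app.public_models.general_model

    response = model.predict_by_url("https://example.com/window.jpg")
    for concept in response["outputs"][0]["data"]["concepts"]:
        # Clarifai returns probabilities in [0, 1]; scale to match the list above.
        print(f"{concept['name']} {concept['value'] * 100:.1f}")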

Imagga
created on 2019-05-29

car mirror 58
mirror 51.3
windshield wiper 46.2
television 41.6
mechanical device 37
reflector 34.8
telecommunication system 30.8
mechanism 27.7
device 21.2
screen 20.2
grunge 19.6
old 17.4
antique 17.3
windshield 16.7
black 15.6
texture 15.3
ancient 14.7
vintage 14.1
car 13.8
person 13.7
aged 13.6
material 13.4
protective covering 13.3
retro 13.1
adult 12.9
dirty 12.6
frame 12.5
wall 12
hair 11.9
border 11.8
portrait 11.6
damaged 11.4
face 11.4
light 11.4
dark 10.9
road 10.8
transportation 10.8
grungy 10.4
structure 10.2
space 10.1
people 10
art 9.8
pattern 9.6
vehicle 9.4
close 9.1
pretty 9.1
design 9
happy 8.8
driver 8.7
driving 8.7
covering 8.7
automobile 8.6
weathered 8.5
inside 8.3
rough 8.2
man 8.1
interior 8
smile 7.8
sitting 7.7
empty 7.7
film 7.7
attractive 7.7
rusty 7.6
drive 7.6
one 7.5
backdrop 7.4
business 7.3
gray 7.2
looking 7.2
body 7.2
paper 7.1
night 7.1
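
The Imagga tags could likewise be reproduced through Imagga's REST tagging endpoint, which returns tag names with confidence scores out of 100. A minimal sketch with the requests library; the API key, secret, and image URL are placeholders, and the endpoint path reflects Imagga's public v2 API rather than anything documented in this record:

    # Sketch: request tags from the Imagga v2 tagging endpoint.
    # Credentials and the image URL are assumed placeholders.
    import requests

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/window.jpg"},
        auth=("<api-key>", "<api-secret>"),
    )
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")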

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

Captions

Microsoft

a close up of a person 47%
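
The Microsoft caption above, with its confidence percentage, matches the output format of an automatic image-captioning service such as Microsoft's Computer Vision API. A minimal sketch, assuming an Azure Cognitive Services endpoint and key (neither appears in this record) and using the azure-cognitiveservices-vision-computervision SDK's describe_image call:

    # Sketch: generate a caption with confidence using Azure Computer Vision.
    # The endpoint, key, and image URL are placeholders, not values from this record.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<your-key>"),
    )

    description = client.describe_image("https://example.com/window.jpg", max_candidates=1)
    for caption in description.captions:
        # Confidence is reported on a 0-1 scale; multiply by 100 for a percentage.
        print(f"{caption.text} {caption.confidence * 100:.0f}%")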