Human Generated Data

Title

[Piano]

Date

1950 or 1952

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.321.9

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Clarifai
created on 2019-05-29

no person 99.4
people 99
one 97.3
monochrome 93.4
light 92.9
vehicle 90.4
room 90.4
indoors 89.2
street 89.1
adult 88.4
watercraft 86.8
war 85.4
architecture 83.8
window 83.3
offense 83
military 82.3
water 81.7
aircraft 78.8
man 77.6
transportation system 76.9

Imagga
created on 2019-05-29

washbasin 39.2
basin 35.4
interior 31.8
room 29.9
chair 28.9
device 28
house 27.6
furniture 26.9
vessel 26.9
architecture 25
home 22.3
wall 20.5
window 20.1
modern 19.6
toilet 19.5
seat 17.2
cell 17.1
light 16.7
instrument of execution 16.6
bathroom 16
old 16
container 15.9
building 15.9
instrument 15.7
clean 15
wood 15
design 14.1
glass 14
electric chair 13.8
support 13.7
luxury 13.7
decor 13.3
floor 13
furnishing 12.9
inside 12
wooden 11.4
indoors 11.4
bath 11.4
decoration 10.8
washstand 10.7
lamp 10.6
new 10.5
contemporary 10.3
indoor 10
door 10
city 10
structure 9.9
residential 9.6
water 9.3
shower 9.2
stylish 9
metal 8.8
steel 8.8
sink 8.8
sconce 8.7
urban 8.7
wash 8.7
empty 8.6
business 8.5
open 8.1
equipment 7.9
faucet 7.9
tub 7.7
industry 7.7
windows 7.7
hotel 7.6
mirror 7.6
hygiene 7.6
bracket 7.3
travel 7

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

wall 95.6
black and white 95
piano 94
furniture 86.6
indoor 86.1
white 69.7
monochrome 55.3
chair 51.4

Color Analysis

Categories

Imagga

interior objects 99.8%

Captions

Microsoft
created on 2019-05-29

a person sitting in a dark room 34.4%
a person in a dark room 34.3%
a person in a dark room 34.2%