Human Generated Data

Title

[Vases of brushes and artistic tools]

Date

late 1930s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.524.21

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Furniture 98.3
Table 73.9
Human 66.7
Person 66.7
Art 58.2
Wood 55.6

Clarifai
created on 2019-11-19

no person 97.8
indoors 96.6
furniture 95.6
room 94.6
people 89.6
monochrome 85.5
chair 82.8
group 82.3
contemporary 79.9
interior design 79.8
window 79
military 79
luxury 78.4
wear 78.1
home 76.7
war 75.9
light 75.8
retro 75.5
paper 75.3
mirror 75.1

Imagga
created on 2019-11-19

windowsill 38.1
sill 30.7
structural member 25.1
house 25.1
architecture 24.6
building 20.1
support 19.6
room 18.8
interior 18.6
home 18.3
quill 16.5
wall 15
old 14.6
decor 14.1
modern 14
window 14
structure 14
pen 13.5
travel 13.4
decoration 13.2
furniture 13
table 13
device 12.9
luxury 12.9
design 12.4
floor 12.1
newspaper 11.7
hotel 11.5
style 11.1
stone 11
apartment 10.5
urban 10.5
space 10.1
city 10
writing implement 9.9
product 9.7
vase 9.7
lifestyle 9.4
plant 9.3
relaxation 9.2
color 8.9
living 8.5
tourism 8.2
indoor 8.2
light 8
water 8
glass 7.8
creation 7.8
empty 7.7
construction 7.7
balcony 7.7
residential 7.7
clean 7.5
wood 7.5
lamp 7.4
retro 7.4
exterior 7.4
office 7.3
detail 7.2
black 7.2
art 7.2
book 7.1
indoors 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

indoor 97.2
sketch 96.7
drawing 96.5
vase 80.1
text 79.7
painting 63.2
white 61
cluttered 20.5

Color Analysis

Feature analysis

Amazon

Person 66.7%

Categories

Imagga

interior objects 96.6%
food drinks 1.1%

Captions