Human Generated Data

Title

[Toy train on windowsill]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1002.4

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Wristwatch 81
Wood 61.2
Lock 56.9

Clarifai
created on 2019-11-16

no person 95.6
vehicle 95.1
container 94.1
still life 91
one 90.2
old 90.2
car 89.9
flame 89.9
vintage 88.9
food 86.8
technology 85.8
indoors 84.6
light 83.9
military 83.3
building 82.7
industry 82.7
drink 82.5
transportation system 82.1
wood 80.7
energy 79.5

Imagga
created on 2019-11-16

mechanism 67.7
device 63.6
shutter 52
mechanical device 39.2
film advance 34.5
lock 28.4
equipment 28
combination lock 26
metal 20.9
technology 20.8
antique 18.2
camera 18.1
old 17.4
aperture 16.5
business 16.4
object 16.1
close 16
silver 15.9
lens 14.8
fastener 14.8
security 14.7
black 14.4
instrument 14.3
safe 14.2
hand 14
retro 13.9
industry 13.7
compass 13.5
vintage 13.2
safety 12.9
photographic equipment 12.8
light meter 12.7
steel 12.4
storage 12.4
photography 12.4
open 11.7
magnetic compass 10.9
dial 10.8
computer 10.4
drive 10.4
chrome 10.4
control 10.2
work 10.2
closeup 10.1
data 10.1
protection 10
restraint 9.9
film 9.9
mechanical 9.7
secure 9.7
numbers 9.6
money 9.4
number 9.3
focus 9.3
regulator 9
disk 8.7
protect 8.7
classic 8.4
part 8.3
paper 7.8
turn 7.8
ancient 7.8
machine 7.7
age 7.6
hard 7.6
electronics 7.6
store 7.6
digital 7.3
tool 7.2
bearing 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

Categories

Imagga

food drinks 90.3%
interior objects 5.3%
paintings art 2.2%

Captions

Microsoft
created on 2019-11-16

a close up of a window 57%
close up of a window 51.4%
a close up of a device 44.2%