Human Generated Data

Title

[Tops of men's heads]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.206.23

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags (label with confidence score, %)

Amazon
created on 2019-11-18

Nature 85.1
Building 72.6
Outdoors 72.2
Human 67.7
Smoke 66.7
Food 64.1
Meal 64.1
Text 62.8
Bunker 62.3
Weather 57.5
Castle 57.3
Architecture 57.3
Fort 57.1
Furniture 55.6
Person 42

Clarifai
created on 2019-11-18

building 96.2
no person 96.2
house 95.9
people 95.2
monochrome 94.8
architecture 94.2
city 93.5
room 93.5
wall 93.3
light 89.5
adult 89.2
vehicle 88.8
art 87.9
black and white 87.4
wear 86.3
winter 86.2
furniture 85.1
landscape 85.1
vintage 83.1
calamity 83.1

Imagga
created on 2019-11-18

vehicle 25.9
negative 23.1
device 21.6
military vehicle 20.3
tracked vehicle 19.6
half track 19.4
film 18.9
technology 15.6
architecture 14.8
photographic paper 14.6
wheeled vehicle 14.4
hand 13.7
work 12.6
business 12.1
old 11.8
musical instrument 11.5
paper 11.3
machine 11.2
home 11.2
modern 10.5
laptop 10.3
black 10.2
conveyance 10.2
man 10.1
guitar 10
house 10
photographic equipment 10
city 10
male 9.9
bulldozer 9.7
people 9.5
computer 9.1
tractor 8.9
light 8.7
world 8.6
3d 8.5
travel 8.4
equipment 8.2
mousetrap 8.1
lifestyle 7.9
design 7.9
stringed instrument 7.8
high 7.8
floor 7.4
tourism 7.4
town 7.4
close 7.4
music 7.2
looking 7.2
person 7.1

Google
created on 2019-11-18

Microsoft
created on 2019-11-18

text 95.5
fog 78.7
black and white 75.3
white 69.7
old 68.1
monochrome 60.7
sky 59.6
house 51.2
dirty 11.5

Feature analysis

Amazon

Person 42%

Categories

Imagga

text visuals 40.5%
interior objects 27.5%
paintings art 16.2%
food drinks 13.1%

Captions

Microsoft
created on 2019-11-18

a dirty old room 81%
an old photo of a person 53.9%
a black and white photo 52.8%

Text analysis

Amazon

U9

Google

-WILLIANS VAES MEANT
-WILLIANS
VAES
MEANT