Human Generated Data

Title

[Mills College, Oakland, California]

Date

1936-1937

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.111.28

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-23

Back 99.9
Body Part 99.9
Clothing 99.1
Coat 99.1
Person 98.5
Adult 98.5
Male 98.5
Man 98.5
Person 97.9
Water 97.4
Shorts 94.5
Outdoors 88.7
Gravel 83.2
Road 83.2
Head 69.9
Person 68.7
Nature 66.7
Walking 66
Pond 56.6
Angler 56.5
Fishing 56.5
Leisure Activities 56.5
Puddle 56.4
Path 56.3
Tarmac 55.9
Canal 55.8
Garden 55.5
Gardener 55.5
Gardening 55.5
Standing 55.2
Soil 55.1

Clarifai
created on 2018-08-23

people 99.9
adult 98.8
one 98.7
monochrome 96.2
group together 95.1
two 94.9
man 94.3
woman 93.1
street 91.3
administration 88.7
wear 88.6
vehicle 85.9
war 85.7
military 82.8
group 82.6
child 82.1
veil 81.4
leader 78.7
action 76.8
boy 75.8

Imagga
created on 2018-08-23

step 30.9
device 29.3
sprinkler 29.3
mechanical device 26
support 23.6
wall 22.7
structure 20.2
mechanism 19.3
old 17.4
stone 17
fountain 15.5
sidewalk 15.5
street 12.9
man 12.8
memorial 12
building 11.9
gravestone 11.8
road 11.7
sky 11.5
outdoors 11.2
history 10.7
black 10.2
architecture 10.2
people 10
vintage 9.9
travel 9.9
antique 9.5
barrier 9.2
city 9.2
park 9.1
texture 9
ancient 8.7
concrete 8.6
grunge 8.5
danger 8.2
adult 7.8
brick 7.7
outdoor 7.7
person 7.5
house 7.5
dark 7.5
water 7.3
detail 7.2
art 7.2
trees 7.1

Google
created on 2018-08-23

Microsoft
created on 2018-08-23

outdoor 96.4

Feature analysis

Amazon

Person 98.5%
Adult 98.5%
Male 98.5%
Man 98.5%
