Human Generated Data

Title

[Garage/Storage space]

Date

1930s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.486.35

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 99.2
Person 99.2
Person 97
Outdoors 82.8
Photography 72.6
Photo 72.6
Water 71.3
Nature 68.1
Photographer 63.3
Face 56.5
Portrait 56.5
Vehicle 55.6
Transportation 55.6
Land 55.5
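
The Amazon tags above are the kind of output returned by AWS Rekognition's label-detection endpoint: a label name paired with a confidence score in percent. A minimal sketch of such a call with boto3 is below; the file name and thresholds are hypothetical placeholders, and the exact pipeline that produced these tags is not documented here.

```python
import boto3

# A minimal sketch of an AWS Rekognition label-detection call (boto3).
# "feininger_photo.jpg" and the thresholds are hypothetical placeholders.
client = boto3.client("rekognition")

with open("feininger_photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,          # cap the number of returned labels
    MinConfidence=50.0,    # drop labels below 50% confidence
)

# Each label carries a name and a confidence in percent, matching the
# "Human 99.2", "Person 99.2", ... pairs listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```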

Clarifai
created on 2019-11-19

people 99.9
group together 98.5
adult 97.8
man 96.7
monochrome 96
group 96
one 95.9
two 95.7
war 95.5
military 93.9
vehicle 93.9
street 93.5
four 89.4
three 88.9
woman 88.2
transportation system 87.8
wear 87.2
child 86.9
skirmish 86.6
soldier 85.3
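
The Clarifai concepts above resemble the output of Clarifai's v2 predict endpoint. The sketch below is an assumption about how such tags could be requested over REST; the API key, model ID, and file name are placeholders, and Clarifai natively returns values between 0 and 1 (scaled to percent here to match the list above).

```python
import base64
import requests

# A hedged sketch of a Clarifai v2 "predict" request over REST.
# API_KEY, MODEL_ID, and the image file are hypothetical placeholders.
API_KEY = "<clarifai-api-key>"
MODEL_ID = "<general-model-id>"

with open("feininger_photo.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
resp.raise_for_status()

# Concepts come back with a value in [0, 1]; scaling by 100 gives the
# percent-style scores shown above ("people 99.9", "group together 98.5", ...).
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```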

Imagga
created on 2019-11-19

graffito 38.9
freight car 30.8
dark 25.9
car 25.8
decoration 25.7
wheeled vehicle 22.9
landscape 19.3
vehicle 18.7
light 18.7
sky 16.6
travel 15.5
water 14
sun 13.7
building 13.7
scene 13
sunset 12.6
old 12.5
silhouette 12.4
sea 11.7
night 11.5
tree 11.5
man 11.4
black 11.4
evening 11.2
summer 10.9
holiday 10.7
trees 10.7
people 10.6
beach 10.3
ocean 10.1
conveyance 9.9
street 9.2
outdoor 9.2
park 9.2
tourism 9.1
vacation 9
tunnel 8.9
scenic 8.8
boat 8.8
wall 8.7
fishing 8.7
dusk 8.6
path 8.5
sunrise 8.4
destination 8.4
house 8.4
city 8.3
dirty 8.1
wet 8
sand 8
misty 7.9
forest 7.8
spooky 7.8
mystery 7.7
swing 7.7
stone 7.6
rain 7.5
peaceful 7.3
road 7.2
person 7.2
fantasy 7.2
coast 7.2
shovel 7.2
adult 7.1
mechanical device 7.1
day 7.1
structure 7.1
architecture 7
season 7
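
The Imagga tags above look like the response of Imagga's v2 tagging endpoint. A hedged sketch, assuming HTTP Basic authentication with an API key/secret pair and a publicly reachable image URL (all placeholders):

```python
import requests

# A hedged sketch of an Imagga v2 tagging request.
# The key, secret, and image URL are hypothetical placeholders.
API_KEY = "<imagga-api-key>"
API_SECRET = "<imagga-api-secret>"
IMAGE_URL = "https://example.org/feininger_photo.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry pairs a confidence (percent) with a tag name, matching the
# "graffito 38.9", "freight car 30.8", ... list above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```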

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

ground 96
black and white 95.9
monochrome 83
text 82.4
street 72.7
house 62.5

Face analysis

Amazon

AWS Rekognition

Age 32-48
Gender Male, 50.3%
Fear 49.6%
Calm 49.6%
Angry 49.6%
Happy 49.5%
Confused 49.5%
Surprised 49.5%
Sad 50.2%
Disgusted 49.5%
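
The age range, gender, and emotion percentages above correspond to the fields that AWS Rekognition's face-detection endpoint returns for each detected face. A minimal sketch with boto3 (the file name is a placeholder):

```python
import boto3

# A minimal sketch of an AWS Rekognition face-analysis call (boto3).
# "feininger_photo.jpg" is a hypothetical placeholder.
client = boto3.client("rekognition")

with open("feininger_photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are returned as a list of {Type, Confidence} pairs,
    # as in the Fear/Calm/Angry/... percentages listed above.
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```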

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft
created on 2019-11-19

a person in a dark room 39.6%
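
Both the Microsoft tags listed earlier and this caption are the kind of output returned by a single Azure Computer Vision Analyze Image request with the Tags and Description features enabled. A hedged sketch, assuming key-based authentication; the endpoint region, subscription key, and file name are placeholders:

```python
import requests

# A hedged sketch of an Azure Computer Vision "Analyze Image" request.
# ENDPOINT, SUBSCRIPTION_KEY, and the image file are hypothetical placeholders.
ENDPOINT = "https://<region>.api.cognitive.microsoft.com/vision/v2.0/analyze"
SUBSCRIPTION_KEY = "<azure-subscription-key>"

with open("feininger_photo.jpg", "rb") as f:
    image_bytes = f.read()

resp = requests.post(
    ENDPOINT,
    params={"visualFeatures": "Tags,Description"},
    headers={
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()
analysis = resp.json()

# Tags carry confidences in [0, 1]; scaled by 100 they match the
# "ground 96", "black and white 95.9", ... values listed under Microsoft above.
for tag in analysis["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")

# The description captions give "a person in a dark room"-style sentences
# with a confidence score.
for caption in analysis["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```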