Human Generated Data

Title

[Houses and woman walking on road]

Date

1931

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.178.16

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-18

Nature 99.8
Outdoors 99.5
Building 97.1
Shelter 96.7
Countryside 96.7
Rural 96.7
Human 94.9
Person 94.9
Hut 89.9
Shack 83.4
Asphalt 68.8
Tarmac 68.8
Road 66.5
Housing 66.3
Person 64.4
People 62
Path 59.2
Wall 55.9

Clarifai
created on 2019-11-18

people 99.1
street 98.9
monochrome 98
home 96.8
no person 96.4
architecture 95.8
house 95.7
building 95.4
group 90.5
adult 90.4
road 90.2
black and white 89.3
group together 89
town 88.4
one 87.6
man 86.9
two 86.7
shadow 85.7
city 85.7
military 83.8

Imagga
created on 2019-11-18

barn 99.1
building 77.2
roof 77.1
thatch 72.1
farm building 70.4
structure 44.3
protective covering 40.7
old 37
architecture 35.3
house 33.5
covering 28.8
rural 28.2
trees 22.3
country 22
sky 20.4
countryside 20.1
home 18.4
ancient 18.2
farm 17.9
clouds 17.8
historic 17.4
scenic 15.8
landscape 14.9
village 14.6
stone 14.5
scenery 13.6
england 13.4
traditional 13.3
wall 12.9
cottage 12.8
grass 12.7
wood 12.5
door 12.4
hut 11.8
history 11.6
wooden 11.4
church 11.1
field 10.9
religion 10.8
snow 10.7
rustic 10.6
brick 10.5
tree 10
tourism 9.9
travel 9.9
houses 9.7
property 9.7
windows 9.6
residential 9.6
winter 9.4
peaceful 9.2
mountain 8.9
construction 8.6
tradition 8.3
exterior 8.3
outdoors 8.2
landmark 8.1
scene 7.8
shelter 7.7
farming 7.6
summer 7.1
agriculture 7

Google
created on 2019-11-18

Microsoft
created on 2019-11-18

building 99.3
outdoor 95.4
house 94.8
black and white 92.7
text 79.3
old 76.3
window 67.8
sky 65
monochrome 58.3
stone 21.5

Face analysis

Amazon

AWS Rekognition

Age 12-22
Gender Male, 50.2%
Surprised 49.5%
Calm 49.5%
Fear 49.5%
Angry 49.5%
Happy 49.5%
Disgusted 49.5%
Confused 49.5%
Sad 50.4%

Feature analysis

Amazon

Person 94.9%

Captions

Microsoft

a vintage photo of a house 92.8%
a vintage photo of an old brick building 86.7%
a vintage photo of an old building 86.6%