Human Generated Data

Title

[Street scene with horse cart, viewed from above, Siemensstadt]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1003.85

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.5
Human 99.5
Animal 99.1
Horse 99.1
Mammal 99.1
Road 96.6
Tarmac 96
Asphalt 96
Machine 94.9
Wheel 94.9
Transportation 82.3
Vehicle 82.1
Pedestrian 80.8
Building 76.5
Street 76.1
Town 76.1
Urban 76.1
City 76.1
Nature 76
Outdoors 75.3
Wagon 75.2
Countryside 68.6
Horse Cart 66.1
Path 63.8
Funeral 58.8
Rural 56.6
Person 51.4

Clarifai
created on 2019-11-16

people 99.2
vehicle 98.2
one 97.3
transportation system 97
two 95.6
monochrome 95.4
adult 95.3
group together 94.2
no person 94.2
man 93.4
group 90.7
street 88.3
woman 86.2
child 83.6
travel 80.7
road 80.6
recreation 79.8
three 79.5
military 78.5
outdoors 77.9

Imagga
created on 2019-11-16

piano 87.7
keyboard instrument 64.3
stringed instrument 63.4
percussion instrument 61.5
upright 58.1
musical instrument 43.6
grand piano 43.3
travel 16.2
man 13.4
city 13.3
track 12.7
transportation 12.5
old 11.8
urban 11.3
black 10.8
walking 10.4
people 10
water 10
ocean 9.9
sidewalk 9.4
sea 9.4
light 9.3
equipment 9.2
road 9
music 9
outdoors 8.3
tourism 8.2
speed 8.2
transport 8.2
device 8.1
tie 7.9
scene 7.8
empty 7.7
line 7.7
construction 7.7
industry 7.7
beach 7.6
dark 7.5
silhouette 7.4
street 7.4
building 7.2
male 7.1
summer 7.1

Google
created on 2019-11-16
(no tags returned)

Microsoft
created on 2019-11-16

bench 99.3
outdoor 93.2
black and white 90.9
text 79.2
park 75.3

Face analysis

Amazon

AWS Rekognition

Age 38-56
Gender Male, 53.3%
Happy 45.1%
Angry 52.2%
Disgusted 45.1%
Calm 45.3%
Fear 45.9%
Surprised 45.6%
Confused 45.2%
Sad 45.5%

Feature analysis

Amazon

Person 99.5%
Horse 99.1%
Wheel 94.9%
