Human Generated Data

Title

[View from above of people walking on sidewalk, Siemensstadt]

Date

1930s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.447.14

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-05

Person 96.4
Human 96.4
Pedestrian 95.5
Person 94.6
Road 93.6
Tarmac 92.3
Asphalt 92.3
Person 90.5
Person 88.9
Person 88.8
Field 81.1
People 79
Building 78.7
City 69.2
Urban 69.2
Town 69.2
Street 69.2
Sports 67.9
Skateboard 67.9
Sport 67.9
Photo 60.6
Photography 60.6
Corridor 58.4
Architecture 57.1
Zebra Crossing 55.9
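Tag lists like the one above are the typical shape of a response from AWS Rekognition's DetectLabels operation: a list of label names, each with a confidence percentage. The sketch below is a hypothetical rendering helper, not the pipeline actually used here; the `format_labels` function and the sample values are illustrative, mirroring a few of the tags above.

```python
# Hypothetical sketch: DetectLabels returns {"Labels": [{"Name": ...,
# "Confidence": ...}, ...]}; render that as "Name confidence" lines
# like the Amazon tag listing above.

def format_labels(response, min_confidence=55.0):
    """Render a DetectLabels-style response as tag lines,
    keeping only labels at or above min_confidence."""
    return [
        f"{label['Name']} {round(label['Confidence'], 1):g}"
        for label in response.get("Labels", [])
        if label["Confidence"] >= min_confidence
    ]

# Sample response shaped like the Amazon tags above (values copied
# from the listing for illustration only).
sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 96.4},
        {"Name": "Pedestrian", "Confidence": 95.5},
        {"Name": "Skateboard", "Confidence": 67.9},
        {"Name": "Corridor", "Confidence": 58.4},
    ]
}
print(format_labels(sample))
```

The `:g` format drops a trailing `.0`, which is why a tag like "People 79" above can appear without a decimal place.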

Clarifai
created on 2021-04-05

people 99.9
group together 99.2
adult 97.5
two 97.2
one 96.6
many 96.6
child 95.5
step 95.1
athlete 94.3
man 93.5
group 93.1
competition 92.6
wear 90.2
three 89.9
woman 89.6
street 88.9
recreation 87.7
boy 85.1
vehicle 84.1
transportation system 83.3

Imagga
created on 2021-04-05

skateboard 25.9
wheeled vehicle 21.9
window screen 20.1
board 20
street 18.4
screen 18.1
sport 17.6
people 16.7
vehicle 16.7
man 16.2
outdoor 15.3
road 14.4
sill 14.4
outdoors 14.2
barrier 13.8
support 13.4
old 13.2
wall 13.2
protective covering 13.1
city 12.5
walking 12.3
urban 12.2
conveyance 12.2
covering 12
speed 11.9
game 11.6
structural member 11.4
line 11.1
grunge 11.1
summer 10.9
exercise 10.9
fitness 10.8
tennis 10.7
court 10.7
track 10.7
travel 10.6
walk 10.5
adult 10.3
device 10.3
action 10.2
structure 10.1
active 9.9
transportation 9.9
recreation 9.9
athlete 9.8
person 9.5
motion 9.4
day 9.4
windowsill 8.8
obstruction 8.8
sidewalk 8.7
light 8.7
step 8.6
empty 8.6
black 8.4
texture 8.3
sports 8.3
leisure 8.3
building 8.2
competition 8.2
alone 8.2
dirty 8.1
shadow 8.1
lifestyle 7.9
doormat 7.9
antique 7.8
space 7.8
play 7.8
running 7.7
fun 7.5
vintage 7.4
retro 7.4
playing 7.3
aged 7.2
history 7.2
life 7.1
male 7.1

Google
created on 2021-04-05

Microsoft
created on 2021-04-05

person 87.7
text 84.8
black and white 83
clothing 75
water 54.4

Face analysis

Amazon

AWS Rekognition

Age 3-11
Gender Female, 52.3%
Calm 47.5%
Sad 41.4%
Happy 6.4%
Surprised 1.7%
Angry 1.1%
Confused 0.8%
Disgusted 0.7%
Fear 0.5%
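The face analysis block above matches the shape of a single FaceDetail from Rekognition's DetectFaces operation (with full attributes requested): an age range, a gender estimate with confidence, and a list of emotions with confidences. A hedged sketch of how such a record could be rendered into the lines above; `format_face` and the sample values are illustrative, not the actual pipeline.

```python
# Hypothetical sketch: a DetectFaces-style FaceDetail carries AgeRange,
# Gender, and Emotions fields; render one as the lines shown above,
# with emotions sorted by descending confidence.

def format_face(face):
    """Render a FaceDetail-style dict as age/gender/emotion lines."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%",
    ]
    emotions = sorted(face["Emotions"], key=lambda e: -e["Confidence"])
    lines += [f"{e['Type'].capitalize()} {e['Confidence']:.1f}%" for e in emotions]
    return lines

# Sample shaped like the record above (values copied for illustration).
sample_face = {
    "AgeRange": {"Low": 3, "High": 11},
    "Gender": {"Value": "Female", "Confidence": 52.3},
    "Emotions": [
        {"Type": "SAD", "Confidence": 41.4},
        {"Type": "CALM", "Confidence": 47.5},
        {"Type": "HAPPY", "Confidence": 6.4},
    ],
}
print(format_face(sample_face))
```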

Feature analysis

Amazon

Person 96.4%
Skateboard 67.9%

Captions

Microsoft

a group of people standing on top of a suitcase 60%
a group of people standing next to a suitcase 59.9%
a group of baseball players standing on top of a suitcase 25.2%