Human Generated Data

Title

[School children on sidewalk, viewed from above, Siemensstadt]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1003.87

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Pedestrian 98.5
Human 98.5
Person 98.4
Person 98.2
Person 97.5
Person 95.9
Asphalt 94.9
Tarmac 94.9
Bike 92
Vehicle 92
Bicycle 92
Transportation 92
Road 90.6
Person 90
Duel 83.2
Light 71.3
Flare 71.3
Field 61.2
Building 58.2
Zebra Crossing 57.1
Urban 57
Street 57
City 57
Town 57
Archaeology 56
Soil 55.7
Silhouette 55.7
Clothing 55.1
Apparel 55.1
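
Tag lists like the one above pair a label with a confidence score, and duplicate labels (e.g. the repeated "Person" entries) are common. A minimal sketch of how such output might be de-duplicated and thresholded in post-processing; the sample tags and scores are copied from the Amazon list above, while the threshold value and function name are illustrative assumptions, not part of any service's API:

```python
def filter_tags(tags, threshold=90.0):
    """Keep unique tag names whose confidence meets the threshold,
    retaining the highest score seen for each name."""
    best = {}
    for name, score in tags:
        if score >= threshold and score > best.get(name, 0.0):
            best[name] = score
    # Sort by confidence, highest first (stable for ties)
    return sorted(best.items(), key=lambda kv: -kv[1])

# A subset of the Amazon Rekognition tags from this record
amazon_tags = [
    ("Pedestrian", 98.5), ("Human", 98.5), ("Person", 98.4),
    ("Person", 98.2), ("Asphalt", 94.9), ("Bicycle", 92.0),
    ("Road", 90.6), ("Duel", 83.2), ("Zebra Crossing", 57.1),
]

print(filter_tags(amazon_tags))
# Low-confidence entries such as "Duel" and "Zebra Crossing" drop out.
```

A threshold around 90 would keep only the labels the model is most certain of; lowering it admits the speculative tail seen at the bottom of each list.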

Clarifai
created on 2019-11-16

people 99.9
group together 98.5
adult 97.6
man 97.6
one 96.9
monochrome 96.8
street 96
vehicle 95.3
transportation system 95
two 94.8
woman 92.5
group 92
child 91.9
wear 91.2
action 87.5
athlete 87.4
sports equipment 85.2
many 85
competition 84.6
three 83.3

Imagga
created on 2019-11-16

skateboard 43.6
wheeled vehicle 41.1
board 38.2
vehicle 29.5
man 21.1
conveyance 21
windowsill 18.4
people 17.8
sport 16.8
street 16.5
sill 16.2
walk 14.3
sidewalk 13.9
exercise 13.6
wall 13.1
urban 13.1
support 13
structural member 12.4
device 12
fitness 11.7
transportation 11.6
walking 11.4
inclined plane 11.2
speed 11
road 10.8
black 10.8
city 10.8
race 10.5
runner 10.4
action 10.2
transport 10
water 10
active 9.9
person 9.7
outdoors 9.7
track 9.6
building 9.5
light 9.3
athlete 9.2
outdoor 9.2
competition 9.1
silhouette 9.1
old 9
adult 9
shadow 9
machine 8.6
concrete 8.6
men 8.6
motion 8.6
legs 8.5
travel 8.4
leisure 8.3
mechanical device 8.2
business 7.9
barrier 7.9
life 7.8
architecture 7.8
male 7.8
empty 7.7
dark 7.5
alone 7.3
game 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

black and white 90.4
person 89.7
street 82.8
water 74.4
text 58.4
monochrome 52.6

Face analysis

Amazon

AWS Rekognition

Age 29-45
Gender Male, 50%
Sad 49.8%
Disgusted 49.5%
Surprised 49.5%
Happy 49.7%
Angry 49.5%
Fear 49.5%
Confused 49.5%
Calm 49.9%

AWS Rekognition

Age 13-23
Gender Female, 50.1%
Angry 49.5%
Happy 49.5%
Disgusted 49.5%
Sad 49.5%
Calm 49.5%
Surprised 49.5%
Confused 50.5%
Fear 49.5%
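
Note that the emotion scores for both faces cluster near 49.5%, with the top value barely ahead of the rest; the model has effectively no confident read on emotion here. A minimal sketch of one way to interpret such output, where the margin check is an illustrative heuristic and not part of the Rekognition API:

```python
def dominant_emotion(scores, min_margin=5.0):
    """Return the top-scoring emotion, or None when it fails to beat
    the runner-up by at least min_margin percentage points."""
    ranked = sorted(scores.items(), key=lambda kv: -kv[1])
    top, runner_up = ranked[0], ranked[1]
    if top[1] - runner_up[1] < min_margin:
        return None  # no clear winner
    return top[0]

# Scores for the second face in this record
face_2 = {"Angry": 49.5, "Happy": 49.5, "Disgusted": 49.5, "Sad": 49.5,
          "Calm": 49.5, "Surprised": 49.5, "Confused": 50.5, "Fear": 49.5}

print(dominant_emotion(face_2))
# "Confused" leads by only 1 point, below the margin, so no emotion is reported.
```

With a margin requirement, both faces in this record would yield no dominant emotion, which matches the near-uniform distributions above.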

Feature analysis

Amazon

Person 98.4%
Bicycle 92%

Captions

Microsoft

a man standing in front of a window 64%
a man standing next to a window 56%