Human Generated Data

Title

[Street scene, Siemensstadt]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1003.97

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Automobile 99.9
Vehicle 99.9
Car 99.9
Transportation 99.9
Tarmac 99.7
Asphalt 99.7
Pedestrian 99.7
Human 99.7
Road 99.4
Person 99.3
Person 99.2
Person 99.1
Person 99
Person 98.4
Person 98.2
Person 98
Person 98
Person 96.9
Person 96.3
Person 95.8
Person 94
Path 93.8
Wheel 87.9
Machine 87.9
Wheel 87.9
Person 82
Zebra Crossing 78
Person 75.7
Sidewalk 67
Pavement 67
Clothing 66.1
Apparel 66.1
Coat 64.6
Person 62.7
Freeway 59.1
Overcoat 59.1
People 56.5
Person 45.6
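Label lists like the one above are the kind of output returned by Amazon Rekognition's DetectLabels API. A minimal sketch of how such name/confidence pairs might be formatted into the listing style used here, demonstrated on a mocked response rather than a live API call (the sample values simply echo tags from this record):

```python
def format_labels(response, min_confidence=0.0):
    """Render a DetectLabels-style response as 'Name confidence' lines.

    `response` follows the documented DetectLabels shape:
    {"Labels": [{"Name": str, "Confidence": float}, ...]}
    """
    lines = []
    for label in response.get("Labels", []):
        conf = label["Confidence"]
        if conf >= min_confidence:
            # One decimal place, with a trailing ".0" dropped
            # to match the listing above (e.g. "Person 99").
            text = f"{conf:.1f}".rstrip("0").rstrip(".")
            lines.append(f"{label['Name']} {text}")
    return lines

# Mocked response echoing the first few tags in this record.
sample = {"Labels": [
    {"Name": "Automobile", "Confidence": 99.9},
    {"Name": "Vehicle", "Confidence": 99.9},
    {"Name": "Tarmac", "Confidence": 99.7},
    {"Name": "Person", "Confidence": 45.6},
]}

print(format_labels(sample, min_confidence=50.0))
```

In a live setting the `response` dict would come from a `boto3` Rekognition client; the formatting logic is independent of how the response was obtained.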

Clarifai
created on 2019-11-16

people 99.9
group together 99.2
vehicle 99
group 98
adult 97.6
transportation system 97.3
man 95.7
many 95.1
one 95
two 94.6
several 94.5
child 92.9
administration 92.8
woman 90.4
four 88.1
war 87.9
leader 87.7
military 86.9
watercraft 86.6
three 84.5

Imagga
created on 2019-11-16

car 52.3
motor vehicle 33
transportation 24.2
vehicle 22.4
truck 21.9
road 18.1
automobile 17.2
auto 17.2
transport 16.4
tow truck 16.2
city 15.8
travel 14.8
sky 14.7
man 14.1
urban 14
wheeled vehicle 13.8
speed 13.7
water 13.3
drive 13.2
street 12.9
wheel 12
racer 11.4
industry 11.1
adult 11
people 10.6
building 10.5
sidewalk 10.5
landscape 10.4
industrial 10
person 9.8
traffic 9.5
motion 9.4
construction 9.4
outdoors 9.4
beach 9.3
black 9.2
old 9
accident 8.8
day 8.6
world 8.6
sea 8.6
fast 8.4
machine 8.2
light 8.1
sunset 8.1
wet 8
river 8
work 7.8
engine 7.8
destruction 7.8
male 7.8
motor 7.7
highway 7.7
power 7.6
ocean 7.6
intersection 7.4
danger 7.3
coast 7.2
portrait 7.1

Google
created on 2019-11-16

Motor vehicle 96.3
Vehicle 92.6
Car 90.5
Classic car 85.2
Classic 84.6
Vintage car 79.8
Antique car 56.9

Microsoft
created on 2019-11-16

vehicle 93.2
black and white 91
car 89.7
land vehicle 88.3
street 87.3
person 73.3
clothing 64
wheel 54.5

Face analysis

Amazon

AWS Rekognition

Age 24-38
Gender Female, 50%
Disgusted 49.7%
Happy 49.5%
Angry 49.9%
Calm 49.5%
Surprised 49.6%
Confused 49.5%
Sad 49.6%
Fear 49.7%

AWS Rekognition

Age 14-26
Gender Male, 50.4%
Angry 50.1%
Fear 49.5%
Confused 49.5%
Happy 49.5%
Calm 49.6%
Disgusted 49.6%
Surprised 49.7%
Sad 49.5%

AWS Rekognition

Age 13-23
Gender Female, 50.2%
Surprised 49.8%
Disgusted 49.5%
Happy 49.6%
Sad 49.7%
Calm 49.7%
Confused 49.6%
Fear 49.6%
Angry 49.5%

AWS Rekognition

Age 13-25
Gender Female, 50.2%
Angry 49.5%
Surprised 49.5%
Sad 49.8%
Fear 49.6%
Calm 49.6%
Happy 49.9%
Confused 49.5%
Disgusted 49.6%
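The face records above follow the shape of Rekognition's DetectFaces output, which reports an age range, a gender estimate, and per-emotion confidences for each detected face. A hedged sketch that condenses one such `FaceDetail` dict into the lines shown above, using a hypothetical detail echoing the first face record (not a live API call):

```python
def summarize_face(face_detail):
    """Condense a DetectFaces-style FaceDetail into listing lines.

    Expected shape (per the documented DetectFaces response):
    {"AgeRange": {"Low": int, "High": int},
     "Gender": {"Value": str, "Confidence": float},
     "Emotions": [{"Type": str, "Confidence": float}, ...]}
    """
    age = face_detail["AgeRange"]
    gender = face_detail["Gender"]
    lines = [
        f"Age {age['Low']}-{age['High']}",
        f"Gender {gender['Value']}, {gender['Confidence']:.1f}%",
    ]
    # Emotions listed from most to least confident.
    for emotion in sorted(face_detail["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True):
        lines.append(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
    return lines

# Hypothetical FaceDetail echoing the first face record above.
face = {
    "AgeRange": {"Low": 24, "High": 38},
    "Gender": {"Value": "Female", "Confidence": 50.0},
    "Emotions": [
        {"Type": "ANGRY", "Confidence": 49.9},
        {"Type": "DISGUSTED", "Confidence": 49.7},
        {"Type": "CALM", "Confidence": 49.5},
    ],
}
print("\n".join(summarize_face(face)))
```

Note that the near-uniform ~49-50% emotion confidences in the records above suggest the classifier found no dominant emotion for these faces.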

Feature analysis

Amazon

Car 99.9%
Person 99.3%
Wheel 87.9%

Captions

Microsoft

a group of people standing on top of a car window 69.8%
a group of people standing on top of a car 67.2%
a group of people standing next to a car 67.1%