Human Generated Data

Title

[People in street]

Date

1931-1932

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.295.2

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Person 99.7
Human 99.7
Person 99
Person 98.7
Person 98.6
Person 98.1
Person 95.3
Pedestrian 94.8
Crowd 87.3
Clothing 87
Apparel 87
Accessory 82.9
Tie 82.9
Accessories 82.9
Architecture 75.9
Building 75.9
Face 72.4
Urban 71.6
People 69.7
City 68.5
Town 68.5
Coat 67.3
Overcoat 67.3
Clock Tower 66.5
Tower 66.5
Vehicle 61.4
Train 61.4
Transportation 61.4
Suit 60.9
Photo 60.3
Photography 60.3
Downtown 58.3
Parade 58.1

Clarifai
created on 2019-05-29

people 100
many 99.7
group 99.5
group together 99.3
adult 96.9
man 95.4
crowd 93.5
street 92.7
several 92.3
war 91
administration 90.7
vehicle 89
monochrome 88.9
military 88.6
woman 85.8
transportation system 84.4
wear 84.3
leader 84.2
railway 77.7
home 77.6

Imagga
created on 2019-05-29

architecture 43.1
building 42.7
passenger car 42.6
city 36.6
car 36.4
wheeled vehicle 35.4
street 34.1
old 30
conveyance 28.2
urban 27.1
vehicle 22.7
travel 22.5
buildings 21.7
tramway 21.5
house 20
tourism 19.8
town 18.6
houses 18.4
tower 17.9
stone 16.9
historic 16.5
church 15.7
window 15.7
road 15.4
history 15.2
exterior 14.7
sky 14.7
tourist 13.7
streetcar 12.6
windows 12.5
lamp 12.4
brick 12.2
historical 12.2
ancient 12.1
landmark 11.7
wall 11.1
vintage 10.8
architectural 10.6
structure 10.3
metropolitan 10.1
center 10.1
cars 9.8
lantern 9.8
scene 9.5
famous 9.3
square 9
arch 8.7
day 8.6
cityscape 8.5
energy 8.4
power 8.4
place 8.4
people 8.4
industrial 8.2
roof 8.1
religion 8.1
balcony 7.9
pavement 7.9
factory 7.7
tree 7.7
industry 7.7
medieval 7.7
capital 7.6
prison 7.6
outdoors 7.5
holiday 7.2
boiler 7.2
university 7.1
sidewalk 7.1
night 7.1

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

outdoor 99
person 94.7
clothing 94
building 92.4
man 89.7
black and white 87.8
people 86.2
sky 76.3
street 73
group 63.5
skyscraper 51.9
crowd 28.4

Face analysis

AWS Rekognition

Age 35-52
Gender Female, 50.7%
Happy 45.1%
Surprised 45.1%
Confused 45.1%
Angry 45.2%
Sad 53.4%
Calm 45.9%
Disgusted 45.3%

AWS Rekognition

Age 26-43
Gender Female, 54.2%
Confused 45.5%
Disgusted 47.9%
Angry 45.8%
Happy 45.3%
Calm 46.1%
Sad 49.2%
Surprised 45.2%

AWS Rekognition

Age 38-57
Gender Male, 50.3%
Confused 49.6%
Happy 49.5%
Angry 49.7%
Sad 49.7%
Surprised 49.5%
Calm 50%
Disgusted 49.5%

AWS Rekognition

Age 29-45
Gender Female, 51.4%
Disgusted 45.2%
Calm 47.7%
Confused 45.3%
Surprised 45.4%
Angry 45.7%
Sad 49.7%
Happy 45.9%

AWS Rekognition

Age 35-52
Gender Male, 50.3%
Sad 50.7%
Disgusted 47.8%
Surprised 45.2%
Calm 45.6%
Angry 45.4%
Confused 45.2%
Happy 45.1%

AWS Rekognition

Age 27-44
Gender Female, 50.3%
Disgusted 49.5%
Calm 49.6%
Surprised 49.5%
Sad 49.6%
Angry 49.5%
Confused 49.5%
Happy 50.2%

AWS Rekognition

Age 26-43
Gender Female, 52.8%
Disgusted 45.1%
Sad 46%
Confused 45.1%
Calm 52.4%
Happy 46%
Angry 45.2%
Surprised 45.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Tie 82.9%
Clock Tower 66.5%
Train 61.4%

Captions

Microsoft

a group of people walking down a street 98.3%
a group of people walking down the street 98.2%
a group of people standing next to a train 91.3%
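Each machine-generated tag above pairs a label with a confidence score on a 0-100 scale. As an illustrative sketch only (the parsing approach is an assumption, not part of the museum record), the plain-text tag lists can be turned into structured pairs; the sample lines are copied verbatim from the Amazon Rekognition section above:

```python
def parse_tags(text):
    """Split each non-empty line into a label and a trailing numeric score."""
    tags = []
    for line in text.strip().splitlines():
        # Only the last whitespace-separated token is treated as the score,
        # so multi-word labels such as "Clock Tower" survive intact.
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

# Sample lines taken verbatim from the Amazon tags above.
sample = """\
Person 99.7
Pedestrian 94.8
Clock Tower 66.5
Parade 58.1"""

tags = parse_tags(sample)
print(tags[2])  # ('Clock Tower', 66.5)
```

The same shape works for the Clarifai, Imagga, and Microsoft lists, since all of them follow the `label score` line format.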