Human Generated Data

Title

[Street scene with merchant horse cart, viewed from above, Siemensstadt]

Date

1940s–1950s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1003.91

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.4
Human 99.4
Person 99.3
Person 99
Pedestrian 98.9
Person 98.8
Person 97.9
Person 96.9
Person 96.4
Person 96.4
Person 94.4
Person 92.3
Person 91
Person 88.3
Person 88.1
Person 87.6
Transportation 86.6
Vehicle 86.5
Person 86.2
People 85.4
Train 85.3
Person 82
Asphalt 81.9
Tarmac 81.9
Person 81.1
Crowd 80.7
Person 80.7
Person 78.6
Person 76.2
Person 74.7
Person 72.4
Person 69.9
Road 69.5
Terminal 68
Train Station 68
Person 62.4
Railway 59.1
Train Track 59.1
Rail 59.1
Funeral 59
Building 56.5
Person 51.5
Person 48.6
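
The Amazon tags above have the shape of AWS Rekognition DetectLabels output: a label name paired with a confidence score in percent. A minimal boto3 sketch of such a call follows; the file name, region, and MinConfidence threshold are illustrative assumptions, not values taken from this record.

```python
import boto3

# Assumed: AWS credentials are configured and the image exists locally.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("brlf_1003_91.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=48.0,  # illustrative; the list above bottoms out near 48.6
)

# Each label pairs a name with a confidence score in percent,
# the same shape as the "Person 99.4" entries above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```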

Clarifai
created on 2019-11-16

people 99.9
vehicle 99.7
group together 99.6
transportation system 98.9
adult 98.8
group 98.7
watercraft 98.2
man 96
military 95.7
many 95.4
aircraft 94.9
one 94.2
wear 92.5
woman 92.4
child 91.3
recreation 91.2
war 91.1
several 90.4
two 90.4
four 90
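
Concepts like those above could be fetched from Clarifai's v2 REST API with its general image model. This is a hedged sketch: the API key, model ID, and image URL are placeholders, and the exact model identifier should be checked against Clarifai's current documentation.

```python
import requests

CLARIFAI_KEY = "YOUR_API_KEY"            # placeholder
MODEL_ID = "general-image-recognition"   # assumed ID of Clarifai's general model
IMAGE_URL = "https://example.com/brlf_1003_91.jpg"  # hypothetical image URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Clarifai scores concepts on a 0-1 scale; scale to percent to match
# the "people 99.9" entries above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```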

Imagga
created on 2019-11-16

passenger 46.1
vehicle 35.3
cockpit 29.4
transportation 27.8
locomotive 23.8
transport 21
power 18.5
steam 17.5
car 17.4
old 17.4
landscape 17.1
industry 16.2
smoke 15.8
train 15.5
travel 14.8
railway 14.7
industrial 14.5
wheel 14.1
vintage 14.1
steam locomotive 14.1
sky 14
railroad 13.7
black 13.2
track 13
machine 12.9
road 12.6
engine 12.5
wheeled vehicle 11.5
automobile 11.5
metal 11.3
outdoors 11.2
danger 10.9
water 10.7
truck 10.6
steel 10.6
coal 8.9
building 8.7
military 8.7
war 8.7
field 8.4
rural 7.9
television 7.9
ship 7.9
rail 7.8
factory 7.8
antique 7.8
cloud 7.7
outside 7.7
construction 7.7
pollution 7.7
auto 7.7
tractor 7.5
lake 7.3
sunset 7.2
weapon 7.2
summer 7.1
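
Imagga exposes a comparable tagging endpoint. A sketch against its v2 /tags API follows; the key/secret pair and image URL are placeholders.

```python
import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder
IMAGE_URL = "https://example.com/brlf_1003_91.jpg"  # hypothetical image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth with key/secret
)
response.raise_for_status()

# Imagga reports confidence in percent with English tag names,
# the same shape as the "passenger 46.1" entries above.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```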

Microsoft
created on 2019-11-16

outdoor 99
aircraft 96.8
airplane 94.7
text 89.7
black and white 85.2
person 85.2
people 82.7
water 80.4
man 78.2
clothing 68.9
vehicle 62.6
group 57.3
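
The Microsoft tags above match the shape of Azure Computer Vision's Analyze output. A sketch against the REST endpoint follows; the resource endpoint, key, image URL, and API version are assumptions.

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder
IMAGE_URL = "https://example.com/brlf_1003_91.jpg"              # hypothetical

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",  # assumed API version
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Azure scores tags on a 0-1 scale; scale to percent to match
# the "outdoor 99" entries above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```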

Face analysis

Amazon

AWS Rekognition

Age 28-44
Gender Female, 50%
Surprised 49.6%
Disgusted 49.5%
Happy 49.5%
Sad 49.5%
Calm 49.5%
Confused 49.6%
Fear 50.3%
Angry 49.5%

AWS Rekognition

Age 27-43
Gender Female, 50%
Happy 49.5%
Sad 49.6%
Disgusted 49.6%
Fear 49.6%
Calm 49.7%
Angry 49.6%
Confused 49.8%
Surprised 49.6%

AWS Rekognition

Age 3-11
Gender Male, 50.4%
Angry 49.5%
Happy 49.5%
Fear 49.6%
Disgusted 49.5%
Sad 49.8%
Calm 50.1%
Surprised 49.6%
Confused 49.5%

AWS Rekognition

Age 18-30
Gender Male, 50.4%
Calm 49.6%
Disgusted 49.5%
Sad 50%
Happy 49.5%
Angry 49.7%
Fear 49.6%
Surprised 49.5%
Confused 49.6%

AWS Rekognition

Age 23-37
Gender Female, 50.3%
Calm 49.7%
Surprised 49.5%
Disgusted 49.5%
Happy 49.6%
Angry 49.6%
Confused 49.6%
Sad 49.8%
Fear 49.5%

AWS Rekognition

Age 30-46
Gender Female, 50%
Surprised 49.6%
Sad 49.8%
Confused 49.5%
Happy 49.8%
Disgusted 49.5%
Fear 49.6%
Angry 49.5%
Calm 49.7%

AWS Rekognition

Age 12-22
Gender Male, 50.4%
Calm 49.6%
Sad 49.5%
Fear 49.5%
Disgusted 49.5%
Confused 50.4%
Surprised 49.5%
Happy 49.5%
Angry 49.5%

AWS Rekognition

Age 21-33
Gender Male, 50.4%
Disgusted 49.5%
Happy 49.6%
Angry 50.1%
Fear 49.5%
Calm 49.5%
Confused 49.6%
Surprised 49.5%
Sad 49.7%

AWS Rekognition

Age 23-35
Gender Male, 50.3%
Sad 49.5%
Surprised 49.6%
Confused 49.5%
Angry 49.5%
Calm 50.3%
Fear 49.5%
Happy 49.5%
Disgusted 49.5%
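
Each block above pairs an estimated age range and gender with per-emotion confidence scores. A minimal boto3 sketch of the Rekognition call that returns such face details follows; the file name and region are hypothetical, not taken from this record.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("brlf_1003_91.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotions on top of
# the default bounding-box data.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # One confidence score per emotion, as in the blocks above.
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```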

Feature analysis

Amazon

Person 99.4%

Text analysis

Amazon

Shuyle

Google

Th am
am
Th
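
Text detections like the Amazon result above come from OCR-style image APIs. A boto3 sketch of AWS Rekognition DetectText follows; the file name is a hypothetical local copy of the image.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("brlf_1003_91.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections are whole lines; WORD detections are their pieces,
# which is why results like "Th am" can appear alongside "Th" and "am".
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}%")
```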