Human Generated Data

Title

[Two men on motorcycles and woman in front of house]

Date

early 1930s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.291.18

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-29

Vehicle 99.7
Transportation 99.7
Motorcycle 99.7
Machine 98.9
Wheel 98.9
Person 98.6
Human 98.6
Person 98.1
Person 97
Person 94.4
Wheel 93.8
Clothing 80.6
Apparel 80.6
Person 77.7
Motor 68.5
Moped 67.1
Motor Scooter 67.1
Vespa 67.1
Plant 65.3
Coat 61.8
Outdoors 61.7
Urban 56.1
Overcoat 55.3
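The Amazon tags above are label/confidence pairs as reported by the tagging service. A minimal sketch of how one might filter such tags by confidence — the pairs below are copied from this record, but the list is abridged and the 90.0 cutoff is an illustrative choice, not part of the record:

```python
# Filter machine-generated tags by confidence score.
# (label, confidence) pairs copied from the Amazon tags in this
# record (abridged); the threshold is an illustrative assumption.
TAGS = [
    ("Vehicle", 99.7), ("Transportation", 99.7), ("Motorcycle", 99.7),
    ("Machine", 98.9), ("Wheel", 98.9), ("Person", 98.6),
    ("Clothing", 80.6), ("Moped", 67.1), ("Vespa", 67.1),
    ("Plant", 65.3), ("Overcoat", 55.3),
]

def high_confidence(tags, threshold=90.0):
    """Return tag names whose confidence meets the threshold."""
    return [name for name, score in tags if score >= threshold]

print(high_confidence(TAGS))
# ['Vehicle', 'Transportation', 'Motorcycle', 'Machine', 'Wheel', 'Person']
```

Raising the threshold trims the noisier guesses (Moped, Vespa, Overcoat) that the service reports at low confidence.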

Clarifai
created on 2019-05-29

people 100
adult 99.6
group together 99.2
man 97.3
vehicle 96.8
group 96.2
military 95.7
transportation system 93.9
war 93.3
soldier 92.8
two 91.8
one 91.4
administration 91.1
child 90.4
uniform 89.9
woman 89.6
three 88.2
wear 86.6
several 85.9
many 85.8

Imagga
created on 2019-05-29

wheeled vehicle 37.6
conveyance 32.2
vehicle 30.3
kin 29.6
wheelchair 25.9
tricycle 23
man 22.2
street 21.2
motor scooter 21.1
city 20.8
adult 19.4
sidecar 18.6
people 17.3
male 16.4
chair 15.4
seat 14.8
building 14
person 13.5
transportation 13.4
black 12.6
urban 12.2
outdoors 11.9
bike 11.7
bicycle 11.1
women 11.1
sidewalk 11
protection 10.9
old 10.4
standing 10.4
walking 10.4
portrait 10.4
sport 10
road 9.9
outdoor 9.9
travel 9.9
lifestyle 9.4
safety 9.2
transport 9.1
architecture 8.6
men 8.6
wheel 8.5
help 8.4
danger 8.2
landmark 8.1
clothing 8.1
accident 7.8
dark 7.5
human 7.5
tourism 7.4
dirty 7.2
to 7.1

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

outdoor 99.9
motorcycle 91.3
person 89.3
wheel 89.1
clothing 87.3
tire 84.5
land vehicle 83.2
black and white 81.5
vehicle 80
man 76.8

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Female, 52.1%
Surprised 45.3%
Calm 45.8%
Happy 51.1%
Confused 45.6%
Sad 45.7%
Disgusted 46.1%
Angry 45.4%
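The face-analysis block reports a confidence value per emotion rather than a single label. A small sketch of reading off the dominant emotion, using the values listed above:

```python
# Emotion confidences copied from the AWS Rekognition
# face-analysis section of this record.
EMOTIONS = {
    "Surprised": 45.3, "Calm": 45.8, "Happy": 51.1, "Confused": 45.6,
    "Sad": 45.7, "Disgusted": 46.1, "Angry": 45.4,
}

def dominant_emotion(emotions):
    """Return the (name, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

print(dominant_emotion(EMOTIONS))
# ('Happy', 51.1)
```

Note how close the values are to one another: only "Happy" stands out, and then only by a few points, so the dominant label here is a weak signal.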

Feature analysis

Amazon

Motorcycle 99.7%
Wheel 98.9%
Person 98.6%

Captions

Microsoft

a person riding a horse in front of a building 87.3%
a person riding a horse in a street 87.2%
a person riding a horse on a street 86.6%