Human Generated Data

Title

[Woman driving a car with zinc on nose and lips]

Date

1940s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.422.5

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-05-30

Person 95
Human 95
Clothing 85.8
Helmet 85.8
Apparel 85.8
Glasses 74.7
Accessory 74.7
Accessories 74.7
Mirror 61.8
Portrait 60.4
Photo 60.4
Face 60.4
Photography 60.4
Car Mirror 56.1
Machine 55.6
Weather 55.5
Nature 55.5

Clarifai
created on 2019-05-30

vehicle 97.7
transportation system 95
people 92.4
war 92.1
car 90.7
old 89
monochrome 88.9
travel 87.4
dark 87.2
no person 86
light 85.9
military 85.1
city 84.5
desktop 83.9
urban 83.4
street 83
abandoned 82.4
accident 82.4
building 81.4
vintage 81.2

Imagga
created on 2019-05-30

iron lung 33.5
vehicle 33.2
cannon 32.9
respirator 26.8
weapon 24.7
gun 24.6
sky 22.9
car 22.7
device 22.4
breathing device 20.6
tank 18.5
travel 18.3
transportation 17
cloud 16.3
military vehicle 15.2
clouds 15.2
old 14.6
road 14.4
landscape 14.1
war 13.5
wheeled vehicle 12.5
engine 12.5
wheel 12.3
transport 11.9
industrial 11.8
architecture 11.8
sunset 11.7
track 11.5
industry 11.1
building 10.8
tourism 10.7
military 10.6
tracked vehicle 10.3
structure 10.3
water 10
automobile 9.6
light 9.4
locomotive 9.3
stone 9.3
power 9.2
air 9.2
street 9.2
machine 8.9
country 8.8
army 8.8
aircraft 8.5
drive 8.5
smoke 8.4
land 8.3
vintage 8.3
historic 8.2
scenery 8.1
metal 8
river 8
world 8
scenic 7.9
destruction 7.8
billboard 7.8
truck 7.7
auto 7.6
conveyance 7.6
windshield 7.6
field 7.5
environment 7.4
nuclear weapon 7.3
danger 7.3
sun 7.2
black 7.2
part 7.2
armored vehicle 7.1
grass 7.1
to 7.1

Google
created on 2019-05-30

Microsoft
created on 2019-05-30

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Female, 86.6%
Disgusted 0.4%
Angry 1%
Surprised 5.5%
Happy 1.9%
Calm 88.2%
Confused 1.6%
Sad 1.3%

Feature analysis

Amazon

Person 95%
Helmet 85.8%
Glasses 74.7%

Captions

Microsoft

a black and white photo of a train 29.1%
a black and white photo of a person 29%
a black and white photo of a person 28.9%