Human Generated Data

Title

[Man in chaise by window]

Date

1930-1935

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.303.3

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2019-05-29

Apparel 86.6
Clothing 86.6
Transportation 56.5
Train 56.5
Vehicle 56.5
Soil 55.5

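The label scores above are confidence values of the kind returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how comparable tags could be generated with boto3, assuming configured AWS credentials and a local copy of the image (the filename here is hypothetical):

import boto3

# Assumes AWS credentials are configured; "BRLF.303.3.jpg" is a hypothetical local filename.
rekognition = boto3.client("rekognition")

with open("BRLF.303.3.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=50,
    )

# Print each label with its confidence, in the same "Name score" layout used above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
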
Clarifai
created on 2019-05-29

people 96.9
war 93.5
vehicle 92.3
no person 92.2
military 91.4
industry 91
one 90.7
grinder 90.1
adult 90.1
man 89.4
monochrome 87.9
wear 87.8
retro 87.2
old 87
weapon 84.2
street 83.8
steel 81.9
art 81.8
transportation system 81.4
dirty 81.3

Imagga
created on 2019-05-29

anvil 31
industry 27.3
block 25.3
work 22
worker 21.3
metal 20.9
industrial 20.9
old 19.5
man 19.5
steel 19.4
construction 18.8
tool 17.5
machine 16.5
gun 16.2
device 16
equipment 15.6
labor 15.6
iron 15.2
vehicle 13.9
safety 13.8
factory 13.6
job 13.3
working 13.3
car 12.9
skill 12.5
weapon 12.3
manufacturing 11.7
stone 11
dirty 10.8
black 10.8
manual 10.7
war 10.6
rifle 10.6
building 10.4
fire 10.3
light 10
protection 10
wood 10
firearm 9.7
repair 9.6
automobile 9.6
heavy 9.5
power 9.2
wreck 9.2
danger 9.1
mask 9.1
camouflage 8.9
craft 8.9
skeleton 8.8
destruction 8.8
rust 8.7
concrete 8.6
uniform 8.4
environment 8.2
transportation 8.1
material 8
machinery 7.9
welder 7.9
weld 7.9
welding 7.9
dump 7.9
soldier 7.8
person 7.8
military 7.7
engine 7.7
auto 7.7
house 7.5
flame 7.5
smoke 7.4
heat 7.4
occupation 7.3
people 7.3
male 7.1
machinist 7

Google
created on 2019-05-29

Microsoft
created on 2019-05-29

drawing 86.9
black and white 83.5
old 41.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Female, 50.6%
Disgusted 1.6%
Sad 6.2%
Confused 1.3%
Calm 63.4%
Happy 21.7%
Angry 4%
Surprised 1.8%

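The age range, gender, and emotion percentages above correspond to the FaceDetails structure that Rekognition's DetectFaces operation returns when all attributes are requested. A hedged sketch, reusing the hypothetical filename from the labeling example:

import boto3

rekognition = boto3.client("rekognition")

with open("BRLF.303.3.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
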
Feature analysis

Amazon

Train 56.5%

Categories

Captions

Microsoft
created on 2019-05-29

an old photo of a person 43.3%
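
The caption and score above match the output format of the Azure Computer Vision "Describe Image" operation. A minimal sketch against the REST API, assuming an Azure Computer Vision resource; the endpoint, key, and filename are placeholders:

import requests

# Placeholder endpoint and key for an Azure Computer Vision resource (assumption).
endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<subscription-key>"

with open("BRLF.303.3.jpg", "rb") as image_file:
    response = requests.post(
        f"{endpoint}/vision/v3.2/describe",
        headers={
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/octet-stream",
        },
        data=image_file.read(),
    )
response.raise_for_status()

# Each caption carries a confidence in [0, 1]; the record above shows it as a percentage.
for caption in response.json()["description"]["captions"]:
    print(caption["text"], round(caption["confidence"] * 100, 1))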