Human Generated Data

Title

[Man working on table]

Date

1938

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.232.7

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Person 99.1
Human 99.1
Furniture 97
Bar Stool 76
Weaponry 63.2
Weapon 63.2
Chair 62.3
Forge 59.8
Clothing 57.5
Apparel 57.5
Tripod 55.7

Clarifai
created on 2019-11-19

people 99.7
adult 97.6
one 95.8
man 94.8
administration 92.2
two 91.9
home 90.4
war 90.2
wear 89.8
military 88.3
street 87.9
group 87.8
soldier 84.9
group together 82.8
leader 82.6
vehicle 81.4
three 80.9
weapon 80.5
room 80.2
offense 80.1

Imagga
created on 2019-11-19

musical instrument 49.9
wind instrument 36.3
man 27.5
harmonica 21
accordion 20
male 18.4
free-reed instrument 17.4
device 17
stringed instrument 17
keyboard instrument 16.9
washboard 16.2
old 15.3
building 15.2
adult 15
person 14.4
people 13.9
black 13.2
rifle 12.7
urban 12.2
violin 12
men 12
city 11.6
weapon 11.5
brass 11.2
architecture 10.9
gun 10.7
fashion 10.5
bowed stringed instrument 10.5
construction 10.3
industry 10.2
window 10.1
vintage 9.9
business 9.7
portrait 9.7
metal 9.6
wall 9.4
lifestyle 9.4
industrial 9.1
worker 8.9
steel 8.8
working 8.8
home 8.8
repair 8.6
safety 8.3
indoor 8.2
retro 8.2
firearm 8.2
music 8.2
life 8.1
banjo 8.1
job 8
indoors 7.9
standing 7.8
hand 7.6
room 7.6
one 7.5
inside 7.4
street 7.4
occupation 7.3
protection 7.3
dirty 7.2
office 7.2

Google
created on 2019-11-19

Snapshot 83.3
Facade 56.5
Ladder 54.5
Window 53.8

Microsoft
created on 2019-11-19

Face analysis

Amazon

AWS Rekognition

Age 18-30
Gender Female, 50.2%
Confused 49.5%
Surprised 49.5%
Fear 49.5%
Angry 49.5%
Sad 49.6%
Disgusted 49.5%
Calm 50.2%
Happy 49.6%

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a man standing in front of a building 89.1%
a man that is standing in front of a building 86.6%
a man standing in front of a window 81.1%