Human Generated Data

Title

[Shop window]

Date

after 1938

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.232.25

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 98.2
Person 98.2
Clothing 92.2
Apparel 92.2
Door 84.4
Indoors 83.1
Interior Design 83.1
Coat 71.3
Overcoat 71.3
Room 69.9
Person 63
Suit 61.5
Silhouette 60.5
Train 60
Transportation 60
Vehicle 60
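
Each Amazon tag above pairs a label with a confidence score out of 100. A minimal sketch of how these pairs might be post-processed, keeping only high-confidence labels and dropping duplicates (the data is copied from the list above; the threshold of 80 is an illustrative choice, not part of the record):

```python
# Machine-generated tags from the Amazon section above, as (label, score) pairs.
tags = [
    ("Human", 98.2), ("Person", 98.2), ("Clothing", 92.2), ("Apparel", 92.2),
    ("Door", 84.4), ("Indoors", 83.1), ("Interior Design", 83.1),
    ("Coat", 71.3), ("Overcoat", 71.3), ("Room", 69.9), ("Person", 63.0),
    ("Suit", 61.5), ("Silhouette", 60.5), ("Train", 60.0),
    ("Transportation", 60.0), ("Vehicle", 60.0),
]

def confident_tags(pairs, threshold=80.0):
    """Keep labels at or above the threshold, deduplicated, in order."""
    kept = []
    for label, score in pairs:
        if score >= threshold and label not in kept:
            kept.append(label)
    return kept

print(confident_tags(tags))
# → ['Human', 'Person', 'Clothing', 'Apparel', 'Door', 'Indoors', 'Interior Design']
```

Lower-scoring labels such as Train, Transportation, and Vehicle (all 60) illustrate why a cutoff is useful: they conflict with the higher-confidence indoor-scene labels.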

Clarifai
created on 2019-11-19

people 99.7
monochrome 98.7
adult 96.6
street 95.8
man 94.3
one 94.1
window 94
door 93.6
woman 93.1
doorway 91.6
music 90.8
portrait 90.7
room 89.4
light 89.4
offense 89.3
analogue 88
administration 86.8
group 86.6
indoors 86.3
two 86.2

Imagga
created on 2019-11-19

elevator 30.7
device 30.6
lifting device 24.6
man 24.2
old 16.7
male 15.6
building 14.7
guillotine 14.6
city 14.1
black 14
urban 14
instrument of execution 13.5
architecture 13.5
business 11.5
person 11.4
people 11.1
street 11
work 11
vintage 10.7
equipment 10.7
window 10.7
light 10.7
pay-phone 10.6
ancient 10.4
instrument 10
prison 9.9
room 9.2
portrait 9.1
office 9.1
telephone 9
interior 8.8
wall 8.5
adult 8.4
attractive 8.4
house 8.4
town 8.3
fashion 8.3
metal 8
posing 8
businessman 7.9
corporate 7.7
industry 7.7
grunge 7.7
statue 7.6
power 7.6
dark 7.5
executive 7.5
one 7.5
clothing 7.5
style 7.4
suit 7.3
door 7.3
industrial 7.3
success 7.2
dirty 7.2
machine 7.2
correctional institution 7.2
history 7.2
passenger 7.1
working 7.1

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

person 93.3
black and white 89.4
clothing 86.1
man 71.4
text 71.3
monochrome 64.6
street 52.6

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 50.1%
Angry 45.5%
Happy 45.3%
Surprised 45.3%
Sad 47.5%
Disgusted 45.1%
Calm 46.8%
Fear 49%
Confused 45.4%
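
The emotion scores above can be reduced to a single best guess by taking the maximum. A minimal sketch (the dictionary is copied from the values listed above; note the scores cluster between 45% and 49%, so no single emotion is a strong prediction):

```python
# Emotion scores from the AWS Rekognition face analysis above.
emotions = {
    "Angry": 45.5, "Happy": 45.3, "Surprised": 45.3, "Sad": 47.5,
    "Disgusted": 45.1, "Calm": 46.8, "Fear": 49.0, "Confused": 45.4,
}

# The highest-scoring emotion is the service's best guess for this face.
dominant = max(emotions, key=emotions.get)
print(dominant)  # → Fear
```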

Feature analysis

Amazon

Person 98.2%
Train 60%

Text analysis

Amazon

FOUNED

Google

FOU
FOU