Human Generated Data

Title

[Shop window with mannequin]

Date

Late 1940s–1955

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.541.12

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-20

Person 96.2
Human 96.2
Person 85.1
Nature 73.9
LCD Screen 61.7
Electronics 61.7
Screen 61.7
Monitor 61.7
Display 61.7
Text 60.2
Outdoors 58.7

Clarifai
created on 2021-04-20

people 99.5
no person 97.7
adult 97.3
man 96.0
monochrome 95.9
street 91.6
wear 90.6
one 89.4
war 88.8
art 88.6
architecture 88.2
vehicle 88.0
woman 87.8
military 87.7
window 87.5
aircraft 86.5
two 85.9
group together 84.0
fog 84.0
dirty 82.5

Imagga
created on 2021-04-20

old 27.2
negative 27.1
wall 22.1
grunge 22.1
film 22
architecture 21.2
stone 20.3
building 17.5
vintage 17.4
texture 17.4
ancient 17.3
photographic paper 17
pattern 15.7
step 14.1
art 13.7
device 13.5
light 13.4
travel 12.7
dirty 12.6
design 12.4
industrial 11.8
history 11.6
photographic equipment 11.3
rock 11.3
cell 11.1
construction 11.1
culture 11.1
support 11
city 10.8
urban 10.5
grungy 10.4
antique 10.4
aged 10
structure 9.9
decoration 9.8
detail 9.7
textured 9.6
black 9.6
color 9.5
man 9.4
industry 9.4
part 9.3
house 9.2
rough 9.1
tourism 9.1
artistic 8.7
concrete 8.6
canvas 8.5
monument 8.4
dark 8.3
backdrop 8.2
landscape 8.2
brown 8.1
metal 8
natural 8
brick 8
sand 7.9
scene 7.8
broken 7.7
window 7.7
sky 7.6
frame 7.5
traditional 7.5
retro 7.4
painting 7.2
tower 7.2
mountain 7.1
surface 7.1

Google
created on 2021-04-20

Microsoft
created on 2021-04-20

indoor 93.2
black and white 77.1
text 76.9
drawing 64.5
fog 50.3
cluttered 10.1

Face analysis

Amazon

AWS Rekognition

Age 7-17
Gender Male, 73.7%
Calm 73.7%
Happy 16.1%
Sad 7.7%
Angry 0.9%
Fear 0.5%
Surprised 0.5%
Confused 0.5%
Disgusted 0.1%

Feature analysis

Amazon

Person 96.2%

Captions

Microsoft
created on 2021-04-20

an old photo of a kitchen 47.6%
an old photo of a person 32.6%
a person in a kitchen 32.5%