Human Generated Data

Title

[Old woman with chickens]

Date

1931–1933

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.248.19

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Person 97.3
Human 97.3
Outdoors 94.2
Nature 90.6
Countryside 76.8
Yard 76.1
Animal 72.8
Rural 71.3
Female 70.4
Bird 69.9
Drawing 66.7
Art 66.7
Girl 63.2
Fowl 61.8
Face 61.8
Cat 61.6
Pet 61.6
Mammal 61.6
Poultry 60.3
Building 57.2
Shelter 57.2

Clarifai
created on 2019-11-19

people 100
adult 99
one 98.4
two 98.3
group 96.7
child 95.3
war 94.1
man 93.9
administration 91.3
home 91.2
group together 90.3
military 90.2
wear 89.7
three 89
soldier 86.8
offspring 86.2
woman 85.8
vehicle 82
four 81.3
leader 80.4

Imagga
created on 2019-11-19

stone 30.8
gravestone 30.7
old 28.6
memorial 23.5
barber chair 21.4
building 21.2
house 20.9
chair 19.1
wall 18.3
architecture 16.4
structure 15.8
window 15.6
ancient 14.7
cemetery 14.1
seat 13.8
grunge 13.6
door 13.4
vintage 13.2
antique 13
wood 12.5
park 12.3
rural 12.3
container 12.2
light 11.4
forest 11.3
barbershop 11.3
travel 11.3
town 11.1
street 11
dirty 10.8
furniture 10.8
retro 10.7
vessel 10.1
city 10
texture 9.7
autumn 9.7
village 9.6
bucket 9.5
shop 9.5
plants 9.3
barrow 9.2
road 9
history 8.9
landscape 8.9
handcart 8.5
art 8.5
bench 8.3
countryside 8.2
wheeled vehicle 8.2
water 8
country 7.9
black 7.8
sepia 7.8
empty 7.7
medieval 7.7
path 7.6
vehicle 7.5
traditional 7.5
tree 7.4
tourism 7.4
peace 7.3
danger 7.3
aged 7.2
scenery 7.2
home 7.2
brick 7.1
season 7
textured 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

grass 99.5
outdoor 96.9
black and white 96.3
text 86.9
house 81.3
waste container 76.3
monochrome 67.3
grave 63.9
old 58.1
stone 5.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 31-47
Gender Male, 52.3%
Fear 45.1%
Sad 47.3%
Angry 45.2%
Calm 51.5%
Happy 45.5%
Surprised 45.2%
Confused 45.1%
Disgusted 45%

Feature analysis

Amazon

Person 97.3%
Cat 61.6%

Captions

Microsoft
created on 2019-11-19

a vintage photo of a person 86.8%
a vintage photo of a girl 75.7%
an old photo of a person 75.6%