Human Generated Data

Title

[Backyard plantings]

Date

1930s

People

Artist: Lyonel Feininger, American, 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.486.43

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 95.6
Person 92.5
Urban 92.4
Person 88.4
Person 88.2
Building 84.7
Person 77.5
Horse 69.7
Animal 69.7
Mammal 69.7
Fireman 66.5
Person 63.4
People 62.6
Outdoors 61.8
Nature 56.2
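
The Amazon tags above follow the label/confidence format returned by AWS Rekognition's DetectLabels operation (Rekognition is the Amazon service named in the face analysis below). As a minimal, hedged sketch only, and not the catalog's actual pipeline, tags of this shape could be regenerated with boto3; the image filename is a placeholder, not part of this record:

import boto3

# Sketch, assuming AWS credentials are configured and the image is available locally.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("backyard_plantings.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the lowest score in the list above is Nature 56.2
)

for label in response["Labels"]:
    # Prints lines such as "Human 95.6", matching the tag format above.
    print(f"{label['Name']} {label['Confidence']:.1f}")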

Clarifai
created on 2019-11-19

people 99.8
group 98.5
vehicle 98
monochrome 97.4
war 97.2
many 96.6
group together 96
military 95.4
calamity 94.9
adult 94.6
no person 93.5
transportation system 93.1
man 91.3
smoke 91.2
skirmish 89.7
soldier 88.7
flame 86.8
street 85.7
interaction 84.8
administration 84.4

Imagga
created on 2019-11-19

shopping cart 35.6
handcart 30.7
wheeled vehicle 28.2
silhouette 24.8
dark 24.2
man 20.2
sunset 19.8
light 16.7
people 15.6
person 15.5
sun 15.3
night 14.2
container 13.8
conveyance 13.1
landscape 12.6
water 12
black 11.5
evening 11.2
male 11.2
sky 10.8
tree 10.7
adult 10.4
portrait 10.4
cool 9.8
old 9.8
one 9.7
chair 9.6
seat 9.5
rain 9.4
sunrise 9.4
sport 9.3
travel 9.2
exercise 9.1
morning 9
trees 8.9
world 8.9
swing 8.7
park 8.5
outdoor 8.4
summer 8.4
city 8.3
fashion 8.3
musical instrument 8.2
marimba 8.1
bench 8.1
symbol 8.1
shadow 8.1
wet 8.1
sexy 8
body 8
autumn 7.9
couple 7.8
darkness 7.8
forest 7.8
sea 7.8
fog 7.7
mystery 7.7
studio 7.6
walking 7.6
hot 7.5
passion 7.5
percussion instrument 7.5
style 7.4
vacation 7.4
street 7.4
mechanical device 7.3
dirty 7.2
women 7.1
plaything 7.1
building 7.1
model 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

text 96.8
black and white 91.2
firefighter 86
old 82.4
monochrome 77.8
fire 75
white 74.1
black 67.5
fog 65
street 58.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-37
Gender Female, 50.2%
Disgusted 49.5%
Fear 50.3%
Sad 49.6%
Calm 49.5%
Angry 49.5%
Confused 49.5%
Surprised 49.5%
Happy 49.5%

AWS Rekognition

Age 22-34
Gender Male, 50.1%
Confused 49.5%
Surprised 49.5%
Fear 49.6%
Angry 49.5%
Sad 50.3%
Disgusted 49.5%
Calm 49.6%
Happy 49.5%

AWS Rekognition

Age 9-19
Gender Female, 50.2%
Happy 49.5%
Calm 49.7%
Angry 49.6%
Surprised 49.5%
Disgusted 49.6%
Confused 49.5%
Fear 49.6%
Sad 49.9%
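
The age ranges, gender estimates, and emotion scores above match the shape of AWS Rekognition's DetectFaces output when all facial attributes are requested. The following is a hedged sketch of how such values could be produced, not the museum's actual process; the filename is again a placeholder:

import boto3

# Sketch, assuming AWS credentials are configured and the image is available locally.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("backyard_plantings.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions, not just the default subset
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types are returned uppercase (e.g. "CALM"); format them like the list above.
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")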

Feature analysis

Amazon

Person 92.5%
Horse 69.7%