Human Generated Data

Title

[Waterfalls of the Housatonic River, with Julia Feininger in foreground, Falls Village, Connecticut]

Date

early 1940s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.583.11

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-20

Person 99.6
Human 99.6
Clothing 97.2
Apparel 97.2
Food 84.9
Meal 84.9
Face 84.8
Nature 82.5
Outdoors 78.2
Dish 76.4
Plant 73.1
People 64.9
Female 62.1
Girl 61.1
Hat 60.8
Machine 59.6
Spoke 59.6

Clarifai
created on 2019-11-20

people 99.9
adult 99.4
wear 97.7
one 96.9
monochrome 96.1
man 96.1
landscape 96.1
portrait 96.1
vehicle 92.6
river 85.8
war 84.9
art 84
soldier 83.7
calamity 82.6
group 82.2
woman 81.2
two 79.4
skirmish 78.7
military 78.2
storm 78

Imagga
created on 2019-11-20

world 24.7
man 24.2
dark 21.7
light 18.4
person 18
smoke 15.8
male 15.1
adult 14.2
fire 14.1
people 13.4
sexy 12.8
black 12.1
hot 11.7
night 11.5
flame 11.3
sky 10.8
happy 10.7
attractive 10.5
expression 10.2
fireplace 9.9
device 9.9
darkness 9.8
portrait 9.7
body 9.6
weapon 9.5
love 9.5
water 9.3
model 9.3
safety 9.2
danger 9.1
industrial 9.1
fashion 9
sunset 9
posing 8.9
burn 8.7
work 8.6
passion 8.5
beach 8.4
energy 8.4
silhouette 8.3
sport 8.2
protection 8.2
landscape 8.2
flamethrower 8
lifestyle 7.9
scene 7.8
outdoors 7.6
enjoy 7.5
one 7.5
cave 7.4
warm 7.4
sensuality 7.3
dress 7.2
face 7.1
happiness 7.1
travel 7

Google
created on 2019-11-20

Microsoft
created on 2019-11-20

outdoor 99
grass 98.5
human face 83.7
text 74.1
black and white 71.8
clothing 71.3
person 59.2
cave 57.3
monochrome 51.8

Face analysis

Amazon

AWS Rekognition

Age 8-18
Gender Male, 66.6%
Calm 0.1%
Surprised 0.2%
Happy 0%
Angry 0.1%
Disgusted 0.2%
Sad 0.7%
Fear 0.2%
Confused 98.5%

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a person that is standing in the grass 60.2%
a person standing on top of a grass covered field 46.1%
a person is standing in the grass 46%