Human Generated Data

Title

[Unidentified people]

Date

1940s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.588.24

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-20

Apparel 100
Clothing 100
Dress 99.4
Human 99.1
Person 99.1
Person 99.1
Person 98.5
Coat 96.5
Overcoat 96.5
Suit 96.5
Female 91.1
Person 90.9
Person 85.6
Automobile 85
Vehicle 85
Car 85
Transportation 85
Robe 82.5
Fashion 82.5
Face 80.6
People 78.6
Woman 77.3
Gown 76.7
Tuxedo 75
Plant 73.4
Wedding 70.5
Photography 68.8
Photo 68.8
Portrait 68.3
Outdoors 68.1
Hat 67.4
Girl 66.5
Bridegroom 66.1
Kid 65.8
Child 65.8
Grass 65.1
Wedding Gown 64
Path 57.6
Man 57.2
Road 56.6
Nature 55.2

Clarifai
created on 2019-11-20

people 100
group together 99.4
child 99.4
group 99.2
several 98
two 97.6
four 97.4
adult 97.2
three 95.8
many 95
wear 94.9
five 94.8
man 94.4
administration 93.8
recreation 92.3
military 90.9
offspring 90.3
sibling 88.9
outfit 88.3
woman 87.8

Imagga
created on 2019-11-20

groom 47.4
people 26.8
person 26.3
couple 24.4
park 22.2
man 22.2
love 19.7
male 17.9
wedding 17.5
bride 17.4
world 17.1
dress 15.4
summer 14.8
walking 14.2
happiness 13.3
outdoor 13
kin 12.9
two 12.7
outdoors 12.1
life 11.7
portrait 11
family 10.7
adult 10.6
bouquet 10.4
day 10.2
smiling 10.1
child 9.8
old 9.8
sun 9.7
married 9.6
marriage 9.5
spring 9.4
youth 9.4
tree 9.2
clothing 8.9
sunlight 8.9
trees 8.9
happy 8.8
together 8.8
fan 8.7
men 8.6
garden 8.4
mother 8.4
church 8.3
sky 8.3
romantic 8
smile 7.8
forest 7.8
walk 7.6
hand 7.6
wife 7.6
human 7.5
city 7.5
landscape 7.4
suit 7.4
building 7.3
spectator 7.3
lifestyle 7.2
active 7.2
religion 7.2
romance 7.1
women 7.1
grass 7.1
to 7.1
follower 7.1
autumn 7

Google
created on 2019-11-20

Microsoft
created on 2019-11-20

text 98.5
clothing 97.5
outdoor 94.7
person 94.1
footwear 61.4
man 59.3
woman 55.3
black and white 54.7

Face analysis

Amazon

AWS Rekognition

Age 41-59
Gender Female, 50.4%
Disgusted 45%
Happy 45.2%
Surprised 45%
Calm 45.2%
Angry 45.4%
Confused 45.1%
Fear 45.5%
Sad 53.5%

AWS Rekognition

Age 28-44
Gender Male, 52.4%
Angry 45%
Surprised 45%
Sad 54.6%
Happy 45%
Disgusted 45%
Fear 45.1%
Calm 45.1%
Confused 45%

AWS Rekognition

Age 44-62
Gender Female, 51.1%
Fear 45.3%
Angry 45.5%
Surprised 45.1%
Happy 45.7%
Confused 45.2%
Disgusted 45.1%
Calm 51.7%
Sad 46.3%

Feature analysis

Amazon

Person 99.1%
Car 85%

Captions

Microsoft

a group of people standing in front of a building 60.8%
a group of people standing next to a building 60.7%
a person standing next to a building 57.6%