Human Generated Data

Title

[People looking out window, 235 E. 22nd St.]

Date

late 1930s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.233.2

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2019-11-19

Human 96.6
Person 83.7
Person 83.2
Drawing 77.8
Art 77.8
Female 77.8
Face 77.1
Clothing 69.7
Apparel 69.7
Person 69.2
Girl 65.2
Person 63.2
Person 61.9
Photography 61.3
Photo 61.3
Portrait 61.3
Woman 57.2
Kid 57.2
Child 57.2
Blonde 57.2
Teen 57.2
Sketch 57

Clarifai
created on 2019-11-19

people 99.9
adult 99.4
two 98.9
man 98.2
one 97.9
wear 97.7
woman 95
group together 95
group 91.2
sports equipment 85.9
skill 85.8
action 85.5
outfit 82
vehicle 81.8
three 81.5
music 81.3
position 79.4
dancing 78.1
recreation 77.8
motion 77.4

Imagga
created on 2019-11-19

person 25.4
dress 23.5
portrait 23.3
people 21.2
adult 20.7
male 18.6
man 18.1
fashion 18.1
swing 17.4
bride 17.1
wall 16.3
hair 15.1
model 14.8
snow 14.1
mechanical device 13.8
wedding 13.8
plaything 13.8
black 13.2
world 13.1
outdoor 13
lady 13
human 12.7
love 12.6
posing 12.4
face 12.1
sexy 12
happy 11.9
pretty 11.9
elegance 11.8
clothing 11.7
groom 11.5
marriage 11.4
body 11.2
attractive 10.5
couple 10.5
mechanism 10.4
looking 10.4
one 9.7
cold 9.5
bouquet 9.4
outside 9.4
happiness 9.4
winter 9.4
cute 9.3
skin 9.3
head 9.2
blond 9.2
pose 9.1
outdoors 9
bridal 8.8
married 8.6
child 8.5
performer 8.4
vintage 8.3
girls 8.2
sensuality 8.2
closeup 8.1
cool 8
lifestyle 7.9
dancer 7.9
flowers 7.8
standing 7.8
youth 7.7
old 7.7
females 7.6
fun 7.5
barbershop 7.3
alone 7.3
teenager 7.3
gorgeous 7.3
romance 7.1
romantic 7.1
women 7.1

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

person 90.9
outdoor 88.4
black and white 87.3
text 82.5
sketch 77.2
clothing 69.7
drawing 69.6
human face 64.9
posing 44.7
linedrawing 15.6

Face analysis

Amazon

AWS Rekognition

Age 40-58
Gender Male, 93.4%
Fear 2.7%
Surprised 2.4%
Disgusted 4.2%
Happy 14.4%
Sad 19.1%
Angry 5.7%
Confused 1.3%
Calm 50.1%

Feature analysis

Amazon

Person 83.7%

Captions

Microsoft

a man and a woman standing in front of a window 51.4%
a man and a woman standing in front of a building 51.3%
a person standing in front of a window 51.2%