Human Generated Data

Title

[Julia Feininger and Walter and Ise Gropius]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1011.134

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Imagga
created on 2019-01-31

man 31.6
male 26.4
person 24.6
people 22.3
black 19.2
couple 18.3
adult 16.9
love 15.8
portrait 15.5
men 15.4
bow tie 12.6
silhouette 12.4
musical instrument 11.9
happy 11.9
kin 11.6
human 10.5
attractive 10.5
hair 10.3
smiling 10.1
clothing 10.1
holding 9.9
day 9.4
happiness 9.4
necktie 9.2
hand 9.1
business 9.1
romance 8.9
style 8.9
wind instrument 8.7
child 8.7
groom 8.6
sax 8.6
marriage 8.5
face 8.5
youth 8.5
casual 8.5
power 8.4
dark 8.3
sexy 8
looking 8
lifestyle 7.9
women 7.9
brunette 7.8
smile 7.8
pretty 7.7
expression 7.7
world 7.6
two 7.6
brass 7.3
music 7.3
success 7.2
sunset 7.2

Google
created on 2019-01-31

Microsoft
created on 2019-01-31

man 92.4
person 91.9
window 86.8
old 72.3
posing 42.1
image 39
black and white 20.3
music 15.2

Face analysis

Microsoft Cognitive Services

Age 39
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Captions

Microsoft

a vintage photo of a man standing in front of a window 84.1%
a man and a woman standing in front of a window 70.6%
a man standing in front of a window 70.5%