Human Generated Data

Title

[Julia Feininger]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1011.103

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Imagga
created on 2019-01-31

electronic instrument 100
musical instrument 100
device 66.8
male 23.4
person 23.3
people 21.7
man 20.2
black 19.8
adult 18.1
portrait 16.8
music 16.2
work 14.9
attractive 12.6
night 12.4
smiling 12.3
working 11.5
studio 11.4
laptop 11
lifestyle 10.8
happy 10.6
computer 10.4
dark 10
business 9.7
table 9.5
smile 9.3
face 9.2
holding 9.1
handsome 8.9
percussion instrument 8.6
sitting 8.6
men 8.6
model 8.6
fire 8.4
smoke 8.4
old 8.4
hand 8.3
human 8.2
style 8.2
light 8
rock 7.8
concert 7.8
youth 7.7
musical 7.7
one 7.5
silhouette 7.4
lady 7.3
danger 7.3
pose 7.2
sexy 7.2
office 7.2
women 7.1
worker 7.1

Google
created on 2019-01-31

Orange 89.9
Fun 70.4
Room 65.7
Photography 62.4
Finger 60.7
Space 56.6

Microsoft
created on 2019-01-31

man 95.4
person 95.2
dark 36.4
music 36.4
art 18.5

Face analysis

Microsoft Cognitive Services

Age 44
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Captions

Microsoft

a man sitting in a dark room 78.8%
a man sitting at a table in a dark room 78.3%
a man sitting in a dark room with a bat 43.9%