Human Generated Data

Title

[Julia Feininger, Dessau]

Date

1929-1930

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.50.2

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2022-06-10

Person 97.4
Human 97.4
Furniture 93.4
Chair 83
Plywood 81
Wood 81
Table 75.5

Imagga
created on 2022-06-10

negative 24
sitting 20.6
person 19.3
chair 19
film 18.9
portrait 18.8
model 18.7
people 16.7
hair 16.6
lifestyle 16.6
sculpture 16.1
sexy 16.1
adult 16
statue 15.6
fashion 15.1
photographic paper 14.6
room 14.5
body 14.4
dress 13.5
architecture 13.4
attractive 13.3
stringed instrument 12.5
lady 12.2
man 12.1
face 12.1
one 11.9
pretty 11.9
art 11.4
male 11.3
furniture 10.9
happy 10.6
seat 10.5
ancient 10.4
home 10.4
musical instrument 10.3
monument 10.3
black 10.2
alone 10
old 9.8
photographic equipment 9.7
indoors 9.7
looking 9.6
erotic 9.5
culture 9.4
casual 9.3
smile 9.3
relaxation 9.2
historic 9.2
travel 9.2
marble 9.1
sensuality 9.1
piano 9
building 8.8
sofa 8.7
love 8.7
luxury 8.6
historical 8.5
skin 8.5
barber chair 8.4
vintage 8.3
holding 8.3
tourism 8.2
human 8.2
indoor 8.2
retro 8.2
decoration 8.2
religion 8.1
smiling 8
grand piano 7.9
device 7.8
happiness 7.8
seated 7.8
youth 7.7
stone 7.6
elegance 7.6
traditional 7.5
city 7.5
leisure 7.5
church 7.4
water 7.3
sensual 7.3
blond 7.2
history 7.2
interior 7.1
look 7

Google
created on 2022-06-10

Microsoft
created on 2022-06-10

person 96.6
text 90.8
clothing 88.4
newspaper 66.7
man 63.7
black and white 62

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 34-42
Gender Male, 62.8%
Sad 100%
Surprised 9.9%
Fear 6%
Calm 5.2%
Angry 3.1%
Confused 1.8%
Happy 0.8%
Disgusted 0.6%

Feature analysis

Amazon

Person 97.4%

Categories

Imagga

paintings art 99.1%

Captions

Microsoft
created on 2022-06-10

a man sitting on a bench reading a book 30.2%
a man reading a book 30.1%