Human Generated Data

Title

[Andreas and Tomas Feininger]

Date

1930s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.437.75

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-12-13

Human 95
Person 93.9
Furniture 86.3
Face 86.3
Bed 68.9
Room 55.3
Indoors 55.3

Clarifai
created on 2023-10-15

people 99.9
one 99.2
adult 99
two 98.2
man 97.2
room 95
furniture 94.8
administration 93.4
child 92.8
portrait 92.1
sit 91.7
three 88.9
medical practitioner 88.8
group 86.6
indoors 84.2
leader 84.2
wear 82.7
home 82.3
education 80.3
family 79.6

Imagga
created on 2021-12-13

blackboard 24.9
television 24.3
old 20.2
people 19.5
person 18.8
happy 18.2
child 18
male 17.2
grunge 17
world 17
telecommunication system 17
man 15.4
aged 15.4
portrait 14.9
black 14.4
room 13.7
wall 13.7
smile 13.5
grungy 13.3
smiling 13
vintage 12.4
hair 11.9
antique 11.2
mother 11.2
art 11.1
happiness 11
adult 10.7
outdoors 10.4
texture 10.4
home 10.4
space 10.1
retro 9.8
family 9.8
boy 9.6
fun 9
kid 8.9
little 8.8
looking 8.8
textured 8.8
negative 8.7
parent 8.7
love 8.7
ancient 8.6
day 8.6
building 8.5
business 8.5
togetherness 8.5
blond 8.4
senior 8.4
film 8.4
head 8.4
office 8.2
indoor 8.2
one 8.2
rough 8.2
school 8.1
dirty 8.1
cute 7.9
design 7.9
empty 7.7
men 7.7
affection 7.7
pretty 7.7
structure 7.6
hand 7.6
females 7.6
frame 7.5
back 7.3
water 7.3
cheerful 7.3
grandma 7.3
lady 7.3
girls 7.3
color 7.2
gray 7.2
childhood 7.2
women 7.1
interior 7.1
businessman 7.1
architecture 7

Google
created on 2021-12-13

Microsoft
created on 2021-12-13

text 92.8
clothing 90.6
black and white 88.9
person 88.7
house 85.4
old 68.1
white 67.5
human face 52.7
man 52.5

Feature analysis

Amazon

Person 93.9%
