Human Generated Data

Title

[Julia Feininger]

Date

1931?

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.289.10

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Clothing 99.2
Apparel 99.2
Person 96
Human 96
Face 89.8
Coat 86.1
Overcoat 76
Female 73.9
Jacket 72.3
Home Decor 65.6
Food 65
Meal 65
Hug 61.1
Curtain 59.7
Dish 56.9
Indoors 56.8
Woman 56.6
Glasses 56.5
Accessories 56.5
Accessory 56.5
Girl 55.9
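The Amazon tags above are (label, confidence) pairs of the kind returned by an image-labeling service such as AWS Rekognition. A minimal sketch of filtering such tags by confidence, using values copied from this record (plain Python, no API call; the 90-point threshold and the function name `high_confidence` are illustrative choices, not part of any service's response format):

```python
# (label, confidence) pairs taken from the Amazon tags listed above.
labels = [
    ("Clothing", 99.2), ("Apparel", 99.2), ("Person", 96.0),
    ("Human", 96.0), ("Face", 89.8), ("Coat", 86.1),
]

def high_confidence(pairs, threshold=90.0):
    """Keep only the labels at or above the given confidence threshold."""
    return [name for name, score in pairs if score >= threshold]

print(high_confidence(labels))  # → ['Clothing', 'Apparel', 'Person', 'Human']
```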

Clarifai
created on 2019-11-19

people 99.3
one 99.1
adult 96.7
man 96.3
portrait 93.7
wear 91.8
vehicle 89.2
indoors 88.7
administration 87.6
window 87.2
war 85.5
side view 83.1
military 82.8
monochrome 82.8
transportation system 82.6
music 82.4
winter 82.2
street 81
technology 80.4
travel 80.2

Imagga
created on 2019-11-19

man 28.2
person 28
television 25.7
people 25.6
home 24.7
male 24.2
adult 22.8
indoors 19.3
telecommunication system 18.3
smiling 18.1
work 17.3
sitting 17.2
business 17
interior 16.8
lifestyle 16.6
happy 16.3
office 15.2
portrait 14.2
face 14.2
computer 14.1
room 13.9
smile 13.5
worker 12.8
attractive 12.6
house 12.5
laptop 12.5
job 12.4
working 12.4
one 11.9
device 11.6
old 11.1
equipment 10.9
hand 10.6
looking 10.4
one person 10.4
book 10.2
alone 10
indoor 10
technology 9.6
black 9.6
machine 9.5
men 9.4
senior 9.4
casual 9.3
domestic 9.2
pretty 9.1
lady 8.9
fireplace 8.8
look 8.8
happiness 8.6
cold 8.6
living 8.5
sit 8.5
occupation 8.2
window 8.2
cheerful 8.1
building 8
handsome 8
to 8
expression 7.7
reading 7.6
fire 7.5
support 7.5
single 7.4
child 7.4
cup 7.4
inside 7.4
gray 7.2
love 7.1
kid 7.1
table 7.1
businessman 7.1

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

person 98.3
window 97.4
text 95.9
black and white 95.3
monochrome 71.8
human face 57.7

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 82.4%
Calm 95.7%
Disgusted 0%
Surprised 0.1%
Angry 0.2%
Fear 0.1%
Happy 0.6%
Sad 3.1%
Confused 0.1%
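The emotion scores above can be reduced to a single dominant emotion by taking the highest-scoring entry. A minimal sketch using the values from this record (the `emotions` dict is an illustrative restatement of the list above, not the raw Rekognition response shape):

```python
# Emotion scores from the AWS Rekognition face analysis above.
emotions = {
    "Calm": 95.7, "Disgusted": 0.0, "Surprised": 0.1, "Angry": 0.2,
    "Fear": 0.1, "Happy": 0.6, "Sad": 3.1, "Confused": 0.1,
}

# The dominant emotion is simply the key with the maximum score.
dominant = max(emotions, key=emotions.get)
print(dominant)  # → Calm
```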

Feature analysis

Amazon

Person 96%

Captions

Microsoft

a person standing in front of a window 82.5%
a person sitting in front of a window 75%
a person standing in front of a window 74.9%

Text analysis

Amazon

nntto
9V
M9Uh
296 nntto tdaiut
tdaiut
296