Human Generated Data

Title

[Julia Feininger]

Date

1940s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.240.30

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Furniture 99.7
Person 95.1
Human 95.1
Text 93.8
Computer 73.8
Pc 73.8
Electronics 73.8
Newspaper 55.9
Cradle 55.9
Crib 55.2

Clarifai
created on 2019-11-19

people 99.1
one 98.9
adult 98.6
woman 93.8
indoors 91.2
man 91
wear 90.7
portrait 90.1
furniture 87.2
window 85.8
facial expression 85.5
vehicle 81.2
sit 81.2
technology 78.2
veil 77.9
transportation system 75.3
seat 74.7
computer 74.1
room 74
child 73.6

Imagga
created on 2019-11-19

person 33.1
scholar 28.5
newspaper 26.9
negative 26.6
people 26.2
laptop 24.8
computer 24.6
man 24.3
intellectual 22
portrait 21.4
film 21.1
adult 20.9
home 20.7
indoors 20.2
product 18.6
old 18.1
work 18.1
male 17.7
sitting 17.2
senior 16.9
photographic paper 16.3
happy 15
creation 14.6
elderly 14.4
office 13.9
looking 13.6
retired 13.6
face 13.5
retirement 13.4
business 13.4
smiling 13
mature 13
notebook 13
lifestyle 12.3
room 12.1
indoor 11.9
desk 11.8
working 11.5
education 11.3
one 11.2
technology 11.1
photographic equipment 11
sculpture 10.8
call 10.5
art 10.5
hair 9.5
casual 9.3
floor 9.3
black 9
gray 9
job 8.8
monitor 8.8
businessman 8.8
student 8.7
smile 8.6
learning 8.5
head 8.4
camera 8.3
holding 8.3
human 8.3
school 8.1
keyboard 8
world 8
worker 8
class 7.7
clothing 7.7
expression 7.7
serious 7.6
reading 7.6
hand 7.6
sit 7.6
alone 7.3
table 7.2
active 7.2
women 7.1
screen 7.1
architecture 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

text 99
person 98.2
human face 97.4
black and white 80.7
clothing 77.5
newspaper 70.3

Color Analysis

Feature analysis

Amazon

Person 95.1%

Captions

Microsoft

a person sitting in front of a laptop 64.3%
a person standing in front of a laptop 64.2%
a person looking at a laptop 64.1%

Text analysis

Amazon

XX