Human Generated Data

Title

[Lux and Jeanne Feininger in front of Museum of Modern Art]

Date

1944-1946

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.530.11

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 99.3
Person 99.3
Person 99.2
Building 80.7
Apparel 80.5
Clothing 80.5
Military Uniform 72.3
Military 72.3
Officer 70.7
Nature 70.2
Architecture 69.8
Face 65.6
Pedestrian 64.6
Outdoors 62.3
Person 60.9
Home Decor 59.9
Urban 57.9
Person 49.6

Clarifai
created on 2019-11-19

people 99.9
group 98.8
adult 98.6
group together 98.1
man 96.5
two 96.1
one 94.9
woman 94
monochrome 93.9
several 92.2
many 91.3
leader 91.1
administration 91.1
wear 90.3
three 89.5
four 88.3
outfit 83.3
portrait 82.6
five 79.1
music 78.9

Imagga
created on 2019-11-19

negative 80.9
film 64.7
photographic paper 49.2
architecture 39.3
building 39
photographic equipment 32.9
city 27.4
window 27
old 25.8
wall 23.2
town 19.5
house 19.2
ancient 19
exterior 17.5
urban 16.6
historic 16.5
travel 16.2
facade 16.1
shop 15.7
street 15.6
barbershop 15.3
home 14.4
stone 14.3
structure 13.8
blackboard 13.6
windows 13.4
sky 13.4
vintage 13.2
construction 12.8
glass 12.5
door 12.4
mercantile establishment 12.2
balcony 12
aged 11.8
tourism 11.6
buildings 11.3
brick 11.3
modern 10.5
roof 10.5
high 10.4
art 9.3
business 9.1
decoration 9
antique 8.7
culture 8.5
perspective 8.5
place 8.4
sign 8.3
retro 8.2
new 8.1
place of business 8.1
history 8.1
interior 8
sepia 7.8
palace 7.7
architectural 7.7
medieval 7.7
finance 7.6
historical 7.5
classic 7.4
detail 7.2
office 7.2
tower 7.2

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

text 96.9
person 88.5
clothing 86.4
human face 77.9
black and white 74.1
man 68

Face analysis

Amazon

AWS Rekognition

Age 12-22
Gender Female, 51.6%
Disgusted 45.6%
Confused 45.2%
Fear 45.1%
Surprised 45.1%
Calm 45.4%
Angry 53.1%
Happy 45.2%
Sad 45.3%

AWS Rekognition

Age 13-25
Gender Female, 50.3%
Angry 45.4%
Happy 45%
Sad 45.9%
Calm 53.5%
Surprised 45%
Fear 45%
Disgusted 45%
Confused 45%

Feature analysis

Amazon

Person 99.3%

Text analysis

Amazon

HARTLEY
FEININGER HARTLEY
FEININGER
N
akax

Google

FEININGER HARTLEY
FEININGER
HARTLEY