Human Generated Data

Title

[Lux and Jeanne Feininger in front of Museum of Modern Art]

Date

1944-1946

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.530.12

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Person 99.8
Human 99.8
Person 99.5
Person 95.7
Military 95.1
Military Uniform 94.5
Person 89.2
Officer 86.2
Nature 85.8
Outdoors 84.8
Person 77.3
Person 75.8
Soldier 68.1
Clothing 62.8
Apparel 62.8
Armored 61.2
Army 61.2
Sailor Suit 58.3
Ice 56.8

Clarifai
created on 2019-11-19

people 99.9
group 98.5
adult 97.9
group together 95.5
several 93.9
administration 93.5
man 93.1
many 92.1
leader 90.6
two 89.2
woman 88.8
wear 87.5
three 86.9
four 85.3
five 84.9
war 81.6
veil 79.7
outfit 77.7
vehicle 76
one 74.8

Imagga
created on 2019-11-19

shop 21.6
mercantile establishment 18.9
architecture 18.9
building 17.8
old 17.4
people 17.3
barbershop 17.2
newspaper 16.4
city 15
person 14.4
room 13.6
home 13.6
negative 13
daily 12.3
urban 12.2
place of business 12.2
man 12.1
travel 12
product 11.9
ancient 11.2
art 11.1
dress 10.8
film 10.5
patient 10.3
wall 10.3
male 10
house 10
tourism 9.9
sculpture 9.7
business 9.7
monument 9.3
creation 9.2
vintage 9.1
family 8.9
interior 8.8
indoors 8.8
adult 8.7
culture 8.5
face 8.5
color 8.3
historic 8.2
decoration 8.2
photographic paper 8.1
column 7.9
scene 7.8
window 7.6
traditional 7.5
nurse 7.5
famous 7.4
holding 7.4
town 7.4
hospital 7.3
detail 7.2
landmark 7.2
black 7.2
celebration 7.2
transportation 7.2
religion 7.2
history 7.2
portrait 7.1
work 7.1
day 7.1

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

text 96
clothing 86.2
person 76.3
black and white 75.9
snow 63.7
man 63.1
old 57.5
cooking 50.2
preparing 45.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 51-69
Gender Male, 97.6%
Calm 14.4%
Happy 0.2%
Fear 1.9%
Surprised 0.5%
Confused 2.2%
Disgusted 1.4%
Angry 5.1%
Sad 74.4%

AWS Rekognition

Age 42-60
Gender Female, 50.2%
Angry 49.7%
Confused 49.5%
Sad 50%
Surprised 49.5%
Calm 49.5%
Happy 49.6%
Disgusted 49.6%
Fear 49.6%

AWS Rekognition

Age 10-20
Gender Female, 50.4%
Happy 49.6%
Angry 50%
Confused 49.5%
Fear 49.5%
Disgusted 49.5%
Surprised 49.5%
Sad 49.6%
Calm 49.8%

Feature analysis

Amazon

Person 99.8%

Categories

Text analysis

Amazon

HARTLEY
FEININGER
R
T
L2

Google

FEININGER HARTLEY
FEININGER
HARTLEY