Human Generated Data

Title

[Julia Feininger and others visiting a building in California]

Date

1940s–1950s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1003.179

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99
Human 99
Person 98.9
Person 98.1
Person 97.3
Apparel 95.9
Clothing 95.9
Overcoat 79.7
Coat 79.7
Bag 78.4
Suit 70.7
Door 61.9
Pedestrian 57.6

Clarifai
created on 2019-11-16

people 100
adult 98.5
group 98.4
two 98.2
group together 98
administration 97
three 96.9
man 96.5
woman 95
leader 94.8
wear 94.1
four 93
child 91.4
five 90
several 89.7
outfit 88.8
uniform 88.3
military 86.9
war 86.6
offense 86.3

Imagga
created on 2019-11-16

barbershop 47.1
shop 44.7
mercantile establishment 33.2
old 22.3
place of business 22.1
man 21.5
building 21.3
window 21.1
city 19.1
architecture 18.7
door 18.1
male 17
people 16.7
urban 16.6
wall 16.2
history 14.3
house 14.2
ancient 12.1
establishment 11.6
travel 11.3
black 11.1
glass 10.9
windows 10.5
hairdresser 10.5
business 10.3
men 10.3
office 10.2
street 10.1
dirty 9.9
entrance 9.7
home 9.6
stone 9.3
historic 9.2
structure 9.2
adult 9.1
vintage 9.1
tourism 9.1
call 8.9
life 8.8
person 8.8
standing 8.7
fashion 8.3
light 8
interior 8
device 7.9
indoors 7.9
construction 7.7
industry 7.7
casual 7.6
historical 7.5
outdoors 7.5
town 7.4
alone 7.3
worker 7.2
work 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

person 98.7
man 96.3
clothing 94.9
standing 94.3
black and white 93.9
door 78.7
text 74.8
monochrome 60

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 39-57
Gender Male, 53.7%
Happy 46.1%
Fear 45%
Sad 45.3%
Confused 45.1%
Disgusted 45.2%
Surprised 45.2%
Calm 52.9%
Angry 45.2%

AWS Rekognition

Age 20-32
Gender Female, 53%
Calm 53.8%
Angry 45%
Surprised 45.1%
Confused 45.1%
Disgusted 45%
Happy 45%
Sad 45.9%
Fear 45%

Feature analysis

Amazon

Person 99%