Human Generated Data

Title

[Visitors at California mission]

Date

1936

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.207.5

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-18

Human 97.7
Person 97.7
Apparel 97.6
Clothing 97.6
Person 97.5
Person 97.4
Person 97
Face 67.3
Coat 66.2
Overcoat 66.2
Hat 65.4
Chef 64.2
People 63.7
Helmet 63.7
Pedestrian 57.2
Sleeve 56.9
Nurse 56.3

Clarifai
created on 2019-11-18

people 99.9
adult 98.8
two 98.6
wear 98.1
group 97.3
uniform 97
man 96.6
veil 95
group together 94.5
three 94.1
leader 93.8
administration 92.9
military 92.1
four 91.3
woman 89.7
one 87.1
many 86.1
several 86
coat 83
outfit 82.8

Imagga
created on 2019-11-18

barbershop 48.8
shop 39.5
old 36.2
door 33.6
building 30.9
mercantile establishment 30.5
wall 29.9
architecture 29.7
house 22.6
ancient 21.6
place of business 20.3
window 19.1
city 19.1
town 16.7
stone 16.1
entrance 14.5
travel 14.1
tourism 14
street 13.8
dirty 13.5
history 13.4
vintage 13.2
brick 12.5
antique 12.1
construction 12
historic 11.9
wooden 11.4
aged 10.9
closed 10.6
establishment 10.6
urban 10.5
detail 10.5
home 10.4
wood 10
doorway 9.8
structure 9.8
village 9.6
culture 9.4
texture 9
metal 8.8
exterior 8.3
retro 8.2
instrument 8
lab coat 8
call 7.8
architectural 7.7
windows 7.7
man 7.4
tradition 7.4
brown 7.4
tourist 7.2
coat 7.2
room 7

Google
created on 2019-11-18

Photograph 96.7
Snapshot 84.1
Standing 80.9
Nurse 61.5
History 60.4
Black-and-white 56.4
Uniform 52.8

Microsoft
created on 2019-11-18

person 98.8
man 96.6
clothing 95.1
outdoor 92.5
black and white 89.7
standing 89.2
door 82.9
text 77.9
white 69.8
old 61.8
posing 57.7
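Each machine-generated tag above pairs a label with a percent confidence score. As a minimal illustration of working with such output (the `confident` helper and the 90% threshold are assumptions for this sketch, not part of any vendor API), the tags can be represented as pairs and filtered by confidence; the sample values are copied from the Amazon list in this record.

```python
# Sample (label, confidence) pairs taken from the Amazon tag list above.
labels = [
    ("Human", 97.7), ("Person", 97.7), ("Apparel", 97.6),
    ("Face", 67.3), ("Hat", 65.4), ("Nurse", 56.3),
]

def confident(tags, threshold=90.0):
    """Keep only the labels whose score meets the confidence threshold."""
    return [name for name, score in tags if score >= threshold]

print(confident(labels))  # only the high-confidence labels remain
```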

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Female, 50.4%
Angry 47.2%
Sad 51.2%
Disgusted 45.2%
Happy 45.6%
Fear 45.3%
Surprised 45.1%
Confused 45.4%
Calm 45.1%

AWS Rekognition

Age 43-61
Gender Male, 52.9%
Disgusted 45.1%
Fear 45%
Angry 45%
Confused 45.1%
Surprised 45%
Sad 53.2%
Happy 45.1%
Calm 46.5%

Feature analysis

Amazon

Person 97.7%
Hat 65.4%
Helmet 63.7%

Text analysis

Amazon

N69IT

Google

ENTRENCE
ENTRENCE