Human Generated Data

Title

[Station stairs]

Date

late 1930s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.518.5

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-05

Human 90.7
Person 89.7
Person 88.8
Person 74.7
Person 71.8
Person 70.9
Person 68.8
Clothing 68.3
Apparel 68.3
People 64
Person 61.9
Person 61.1
Musician 57
Musical Instrument 57
Leisure Activities 56.7
Clinic 55

Clarifai
created on 2021-04-05

people 99.8
group together 99.1
many 98.4
group 97.9
adult 97
vehicle 96.6
man 94.6
woman 94
military 91.6
leader 90.7
music 89.3
war 88.8
outfit 85.9
soldier 85.4
several 84.3
street 83.2
wear 82.4
watercraft 82.1
aircraft 82
crowd 81.5

Imagga
created on 2021-04-05

dishwasher 32.4
shop 25.5
white goods 24.7
mercantile establishment 19.8
home appliance 19.6
barbershop 19.2
industrial 18.1
industry 17.9
appliance 17
factory 16.4
metal 16.1
steel 15.9
manufacturing 14.6
power 14.3
man 13.4
place of business 13.3
machine 13
equipment 12.9
mechanical 12.6
building 12.4
engineering 12.4
light 12
shoe shop 11.7
old 11.1
technology 11.1
inside 11
mechanic 10.7
engine 10.6
work 10.3
black 10.2
people 10
modern 9.8
device 9.8
working 9.7
urban 9.6
architecture 9.4
smoke 9.3
business 9.1
heavy 8.6
male 8.5
city 8.3
vintage 8.3
transportation 8.1
computer 8
iron lung 7.9
enterprise 7.9
iron 7.8
room 7.8
technical 7.7
concrete 7.7
energy 7.6
design 7.4
hospital 7.2
art 7.2
interior 7.1
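The three tagging services above return the same kind of output: a label paired with a confidence score out of 100. A minimal sketch of how such tags might be filtered by a confidence threshold (this is illustrative only, not the museum's actual pipeline; the sample scores are copied from the Clarifai list above):

```python
# Sample tag/confidence pairs, taken from the Clarifai scores above.
clarifai_tags = {
    "people": 99.8,
    "group together": 99.1,
    "many": 98.4,
    "vehicle": 96.6,
    "military": 91.6,
    "street": 83.2,
}

def confident_tags(tags, threshold=90.0):
    """Return tag names whose confidence meets the threshold, highest first."""
    return [name
            for name, score in sorted(tags.items(), key=lambda kv: -kv[1])
            if score >= threshold]

print(confident_tags(clarifai_tags))
# ['people', 'group together', 'many', 'vehicle', 'military']
```

Raising the threshold narrows the list to only the strongest labels, which is one plausible way to decide which machine tags are reliable enough to display alongside human-curated metadata.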

Google
created on 2021-04-05

Microsoft
created on 2021-04-05

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 46-64
Gender Female, 80.8%
Sad 40.6%
Calm 13%
Happy 11%
Angry 10.7%
Surprised 8.4%
Confused 8.1%
Fear 4.2%
Disgusted 4.1%

AWS Rekognition

Age 35-51
Gender Male, 91.7%
Calm 48.9%
Sad 33.4%
Fear 10.6%
Angry 4%
Happy 2.1%
Surprised 0.5%
Confused 0.4%
Disgusted 0.1%

AWS Rekognition

Age 13-25
Gender Female, 53.6%
Sad 64.7%
Calm 14.7%
Fear 8.6%
Angry 3.9%
Confused 3.8%
Happy 3.1%
Surprised 0.6%
Disgusted 0.6%
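Each face record above lists a full emotion distribution rather than a single label. A small sketch of how the dominant emotion per face might be extracted from such a distribution (hypothetical helper, not AWS Rekognition's own API; the numbers are the three faces listed above):

```python
# Emotion distributions for the three detected faces, copied from the
# AWS Rekognition results above (percentages).
faces = [
    {"Sad": 40.6, "Calm": 13.0, "Happy": 11.0, "Angry": 10.7,
     "Surprised": 8.4, "Confused": 8.1, "Fear": 4.2, "Disgusted": 4.1},
    {"Calm": 48.9, "Sad": 33.4, "Fear": 10.6, "Angry": 4.0,
     "Happy": 2.1, "Surprised": 0.5, "Confused": 0.4, "Disgusted": 0.1},
    {"Sad": 64.7, "Calm": 14.7, "Fear": 8.6, "Angry": 3.9,
     "Confused": 3.8, "Happy": 3.1, "Surprised": 0.6, "Disgusted": 0.6},
]

def dominant_emotion(emotions):
    """Return the (name, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

for face in faces:
    print(dominant_emotion(face))
# ('Sad', 40.6)
# ('Calm', 48.9)
# ('Sad', 64.7)
```

Note that the top score for the first face is only 40.6%, so collapsing a distribution to a single label discards real ambiguity; the full percentages above are more informative.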

Feature analysis

Amazon

Person 89.7%

Captions

Microsoft
created on 2021-04-05

a group of people in a room 75.3%
a group of people standing in a room 65.3%

Text analysis

Google

GENTS
ISTOS GENTS FOOD
ISTOS
FOOD