Human Generated Data

Title

El Station Interior, 6th and 9th Avenue Lines, Downtown Side

Date

February 6, 1936

People

Artist: Berenice Abbott, American, 1898–1991

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.176

Copyright

© Berenice Abbott

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 98.7
Human 98.7
Person 98.6
Leisure Activities 97.1
Piano 95.8
Musical Instrument 95.8
Grand Piano 95.8
Pianist 87.6
Performer 87.6
Musician 87.6
Person 87
Person 83.8
Indoors 58

Imagga
created on 2022-01-23

tramway 71.8
conveyance 64.6
architecture 31.8
city 28.3
wheeled vehicle 24.1
building 24.1
urban 22.7
boat 21.9
vehicle 21.2
streetcar 19.3
gondola 19.1
industry 17.9
transportation 17.9
house 17.5
industrial 16.3
travel 16.2
structure 15.5
forklift 15.1
tourism 14.8
construction 14.5
interior 14.2
modern 14
office 13.6
business 12.8
pump 12.7
water 12.7
equipment 12.7
old 11.8
window 11.2
gas pump 11.1
factory 10.7
steel 10.6
black 10.2
town 10.2
inside 10.1
street 10.1
glass 10.1
station 10.1
canal 9.8
river 9.8
technology 9.6
windows 9.6
sky 9.6
vessel 9.4
facility 9.3
energy 9.2
machine 9.2
transport 9.1
tourist 9.1
people 8.9
car 8.9
indoors 8.8
home 8.8
engineering 8.6
buildings 8.5
silhouette 8.3
vacation 8.2
metal 8
light 8
sea 7.8
wall 7.7
england 7.6
perspective 7.5
train 7.3
reflection 7.3
history 7.2
working 7.1

Google
created on 2022-01-23

Table 89.6
Chair 89.4
Coat 86.6
Window 76.5
Suit 74.3
Building 69.8
Desk 66.5
Monochrome photography 65.7
Font 65.6
Rectangle 64.6
Art 64.4
History 63.9
Sitting 63.5
Room 62.3
Door 61.1
Monochrome 61
Machine 58.8
Photographic paper 51.5

Microsoft
created on 2022-01-23

text 99.2
piano 97
black and white 95.2
indoor 93.4
musical keyboard 58.1
monochrome 54.2
furniture 52.3
shelf 51.7

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Male, 99.2%
Calm 43.3%
Confused 27.9%
Angry 15.2%
Surprised 5%
Disgusted 2.9%
Fear 2.4%
Happy 2.2%
Sad 1.1%

AWS Rekognition

Age 34-42
Gender Male, 92.6%
Fear 35.2%
Angry 18.4%
Sad 16.7%
Calm 14.6%
Surprised 9.3%
Happy 3.3%
Disgusted 1.5%
Confused 1%

AWS Rekognition

Age 18-26
Gender Female, 93.5%
Sad 63.8%
Calm 13%
Fear 11.9%
Disgusted 5.8%
Angry 1.9%
Surprised 1.4%
Confused 1.1%
Happy 1.1%

Feature analysis

Amazon

Person 98.7%

Captions

Microsoft

a person standing in front of a store 55.1%
a person standing in front of a store window 45%
a person standing in front of a window 44.7%

Text analysis

Amazon

ENTRANCE
PUSH
NOVEL
LUSH

Google

ENTRANCE
ENTRANCE