Human Generated Data

Title

Untitled (people on steps, NYC)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15895.4

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Handrail 99.8
Banister 99.8
Person 93.4
Human 93.4
Person 93.2
Person 91.8
Furniture 79.4
Railing 73.1
Indoors 69.1
Interior Design 69.1
Text 65.8
Shoe 64.4
Clothing 64.4
Footwear 64.4
Apparel 64.4
Person 61.9
Floor 58.1
Art 57.3
Living Room 57.2
Room 57.2
Staircase 51.5
Person 50.7

Imagga
created on 2022-02-05

refrigerator 49.8
white goods 47.4
home appliance 37.5
architecture 30.5
house 28.4
appliance 26.8
home 24.7
building 23.1
window 22.4
wall 21.4
interior 21.2
sculpture 18
room 16.4
old 16
balcony 14.5
structure 14.2
city 14.1
design 14.1
modern 14
decoration 13.7
art 13.2
light 12.7
durables 12.6
ancient 12.1
travel 12
tourism 11.5
architectural 11.5
dishwasher 11.5
historic 11
stone 11
windowsill 10.7
style 10.4
luxury 10.3
town 10.2
sill 10.2
column 10.1
furniture 10
marble 10
door 9.7
indoors 9.7
culture 9.4
monument 9.3
support 9.1
indoor 9.1
history 8.9
glass 8.8
antique 8.7
urban 8.7
residential 8.6
construction 8.5
historical 8.5
device 8.5
famous 8.4
floor 8.4
exterior 8.3
inside 8.3
landmark 8.1
facade 8.1
new 8.1
structural member 8
empty 7.7
windows 7.7
apartment 7.7
statue 7.6
estate 7.6
temple 7.6
wood 7.5
frame 7.5
vintage 7.4
vacation 7.4
street 7.4
negative 7.3
metal 7.2
lifestyle 7.2
decor 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99.6
black and white 79.2
building 65.4
white 64.8

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 97%
Calm 98.6%
Sad 1%
Happy 0.2%
Angry 0.1%
Fear 0.1%
Disgusted 0%
Surprised 0%
Confused 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 93.4%
Shoe 64.4%
Staircase 51.5%

Captions

Microsoft

a person standing in front of a window 35.2%
a person standing next to a window 27%
a person sitting in a room 26.9%