Human Generated Data

Title

Untitled (woman seated on steps leading from living room)

Date

1962

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10743

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Living Room 99.2
Indoors 99.2
Room 99.2
Furniture 98.1
Interior Design 96.1
Couch 87.4
Lobby 87.2
Chair 85.1
Table 74.1
Human 62.1
Person 62.1
Coffee Table 58.8
Lighting 58.2
Waiting Room 56.4
Reception 55.9

Imagga
created on 2022-01-15

furniture 56.7
room 56.2
interior 54.8
table 44.3
house 40.1
modern 37.1
home 35.9
apartment 34.5
chair 32.7
decor 30.9
bathroom 27.9
design 27
architecture 27
lamp 26.8
luxury 26.6
window 25.1
3d 24.8
floor 24.2
desk 23.7
indoors 22.8
light 21.4
wall 20
decoration 19.5
inside 19.3
clean 18.4
glass 18.1
mirror 18.1
tile 17.3
residential 17.2
kitchen 17.2
comfortable 17.2
living 17.1
sink 16.8
wood 16.7
sofa 16.6
seat 16.2
furnishing 15.5
toilet 14.3
nobody 14
indoor 13.7
metal 13.7
domestic 13.6
elegance 13.4
style 13.3
counter 12.9
faucet 12.8
steel 12.4
bath 12.3
shelf 12.3
vase 11.6
equipment 11.5
estate 11.4
appliance 11.3
contemporary 11.3
structure 11.3
comfort 10.6
rendering 10.5
render 10.4
hospital 10.2
barber chair 10.2
relaxation 10
reflection 9.7
wooden 9.7
wash 9.6
hotel 9.5
scene 9.5
office 9.4
rest 9.4
stylish 9
plant 9
home appliance 8.8
chairs 8.8
carpet 8.8
lifestyle 8.7
bright 8.6
lifestyles 8.5
relax 8.4
cabinet 8.3
stove 8.1
shop 8.1
shower 8
machine 8
medical 7.9
urban 7.9
sewing machine 7.8
space 7.8
stainless 7.8
architect 7.7
dining 7.6
fashionable 7.6
city 7.5
heat 7.4
business 7.3
armchair 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.4
house 91.5
furniture 90.1
white 75.4
black and white 65.6
building 61.1
window 55.2
old 42.7

Face analysis

Amazon

AWS Rekognition

Age 39-47
Gender Male, 94.2%
Calm 36.2%
Sad 31.8%
Happy 24.5%
Confused 2.6%
Disgusted 2%
Angry 1.4%
Surprised 0.9%
Fear 0.5%

Feature analysis

Amazon

Chair 85.1%
Person 62.1%

Captions

Microsoft

an old photo of a building 78.1%
an old photo of a bookshelf 54.4%
a person standing in front of a store window 46.8%

Text analysis

Amazon

48031.

Google

YT3RA°2-- XAGON