Human Generated Data

Title

Untitled (two women at a restaurant table near a clock on a mantle)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4577

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label and confidence score)

Amazon
created on 2021-12-14

Person 99.4
Human 99.4
Clothing 97.9
Apparel 97.9
Hat 97.9
Person 93.3
Person 89.9
Indoors 88.1
Room 87.1
Furniture 76.7
Person 72.8
People 65.6
Living Room 61.3
Photography 60.7
Photo 60.7
Restaurant 58.4
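
The Amazon tags above are label-and-confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. Below is a minimal sketch of how comparable tags could be produced, assuming AWS credentials are configured and a local scan of the photograph exists; the filename, region, and thresholds are illustrative assumptions, not part of this record.

import boto3

# Sketch only: filename and region are assumptions, not part of this record.
client = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_4.2002.4577.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=55,  # the tag list above bottoms out near 58
)

# Print each label with its confidence, mirroring the listing above
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")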

Clarifai
created on 2023-10-15

people 99.6
furniture 98.9
room 97.8
indoors 96.8
chair 96.1
woman 95.5
dining room 95.4
table 95.1
sit 93.4
child 93.4
adult 93.3
family 92.7
seat 91.5
man 90.7
group 90.3
home 86
monochrome 80.6
desk 75.4
mirror 73.6
education 71.4

Imagga
created on 2021-12-14

barbershop 55
interior 50.4
shop 48.5
room 46.4
salon 39.8
home 39.1
furniture 38
chair 36.9
mercantile establishment 34.7
table 34.1
indoors 31.6
house 29.2
modern 26.6
window 24.6
place of business 23.2
decor 23
design 21.9
luxury 19.7
inside 19.3
indoor 18.3
desk 17.9
floor 17.7
office 17.3
wall 17.1
people 16.7
wood 16.7
decoration 16.6
case 16.4
comfortable 16.2
light 16
seat 16
man 15.5
apartment 15.3
lamp 15.3
restaurant 15
person 14.5
style 14.1
domestic 13.7
elegance 13.4
hotel 13.4
work 13.3
architecture 13.3
kitchen 12.6
bedroom 12.6
dining 12.4
male 12.1
elegant 12
establishment 11.6
barber chair 11.2
sitting 11.2
business 10.9
couch 10.6
adult 10.6
residential 10.5
computer 10.4
lifestyle 10.1
3d 10.1
relaxation 10
mirror 9.5
bed 9.5
living 9.5
smiling 9.4
glass 9.3
fashion 9
stylish 9
chairs 8.8
happy 8.8
lighting 8.7
sofa 8.6
meeting 8.5
dinner 8.4
portrait 8.4
old 8.4
counter 8.3
worker 8.1
family 8
building 7.9
wooden 7.9
tables 7.9
food 7.9
color 7.8
nobody 7.8
two 7.6
cabinet 7.6
estate 7.6
laptop 7.3
professional 7.2
life 7
hall 7

Google
created on 2021-12-14

Photograph 94.3
Black 89.8
Hat 87.9
Chair 85.1
Fedora 84.3
Style 83.9
Black-and-white 81.9
Art 77.1
Vintage clothing 76
Classic 75.6
Monochrome photography 75.1
Monochrome 74.5
Snapshot 74.3
Room 71.8
Font 69.7
Stock photography 67.3
Sun hat 65
Picture frame 64.1
History 63.2
Sitting 60.8

Microsoft
created on 2021-12-14

text 99.5
indoor 95.8
furniture 95.2
laptop 92.8
table 86.4
vase 78.8
chair 72.8
house 66.5
fireplace 61.3
black and white 55.5
interior 51.6
desk 11.2

Color Analysis

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 53.1%
Calm 57.2%
Happy 25.6%
Sad 14.2%
Surprised 1.3%
Angry 0.9%
Confused 0.4%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 35-51
Gender Male, 89.3%
Sad 69.4%
Confused 22.8%
Calm 6.3%
Happy 0.8%
Angry 0.3%
Disgusted 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 49-67
Gender Female, 55%
Sad 68.2%
Calm 21%
Happy 9.7%
Fear 0.4%
Angry 0.3%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
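
The age range, gender, and emotion percentages above are face attributes of the kind returned by Rekognition's DetectFaces operation when all attributes are requested. A minimal sketch under the same assumptions as the earlier example (configured credentials; the filename and region are illustrative):

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_4.2002.4577.jpg", "rb") as f:  # assumed filename
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)

# Report each detected face in the same form as the listing above
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")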

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
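
The Google Vision rows report each face's likelihood buckets (surprise, anger, sorrow, joy, headwear, blurred). A minimal sketch with the google-cloud-vision client, assuming a project with the Vision API enabled and default credentials; the filename is again an assumption:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.4577.jpg", "rb") as f:  # assumed filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood is an enum such as VERY_UNLIKELY or VERY_LIKELY
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)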

Feature analysis

Amazon

Person 99.4%
Hat 97.9%

Categories

Imagga

interior objects 99.7%

Text analysis

Amazon

119335
114335

Google

119335.
YAGON-YT37A2-NAMT2A
114335 119335. YAGON-YT37A2-NAMT2A
114335
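
The Text analysis strings above are raw OCR detections of the kind returned by Rekognition's DetectText operation and Google Vision's text detection. A minimal Rekognition sketch, under the same assumptions as the earlier examples (the filename and region are illustrative):

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_4.2002.4577.jpg", "rb") as f:  # assumed filename
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE-level detections roughly correspond to the strings listed above
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])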