Human Generated Data

Title

Untitled (men and women seated around dinner table)

Date

1939

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5009

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.1
Human 99.1
Person 99
Person 98.1
Person 97.5
Person 97.2
Clothing 89.8
Apparel 89.8
People 82.5
Indoors 75.4
Furniture 74.4
Clinic 72.7
Room 72.5
Text 64.3
Sitting 58.3
Suit 57.3
Coat 57.3
Overcoat 57.3
Photography 56.4
Photo 56.4
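
The Amazon tags above are label-detection output: a label name followed by a confidence score. As a minimal sketch (not the museum's documented pipeline), tags of this form could be produced with AWS Rekognition through boto3; the file name photo.jpg and the pre-configured credentials and region are assumptions:

    import boto3

    # Assumes AWS credentials and a default region are already configured.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        response = client.detect_labels(Image={"Bytes": f.read()}, MaxLabels=20)

    for label in response["Labels"]:
        # Each label carries a name and a confidence score, e.g. "Person 99.1"
        print(label["Name"], round(label["Confidence"], 1))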

Clarifai
created on 2023-10-26

indoors 98.2
people 98.2
adult 97.5
man 96.9
woman 91.3
chair 91
sit 89.7
monochrome 88
group 86.8
furniture 84.2
home 81.9
table 81.9
dining room 79.5
window 76.4
horizontal 75.3
side view 74.4
scientist 73.9
togetherness 73.5
room 71.9
three 68.3

Imagga
created on 2022-01-22

counter 42
man 41
people 35.1
male 34
office 32.6
person 31.8
adult 31
businessman 28.3
happy 28.2
business 27.9
clinic 27.3
smiling 26.8
colleagues 26.2
professional 25.6
indoors 25.5
businesspeople 23.7
sitting 23.2
meeting 22.6
shop 22.5
men 22.3
group 21.8
businesswoman 20.9
team 20.6
room 19.9
women 19.8
job 19.5
couple 19.2
work 18.8
patient 18.7
teamwork 18.5
corporate 18
cheerful 17.9
indoor 17.3
30s 17.3
barbershop 16.9
computer 16.8
modern 16.8
together 16.6
occupation 16.5
casual 16.1
home 16
associates 15.7
mid adult 15.4
talking 15.2
mercantile establishment 15.2
doctor 15
senior 15
smile 15
day 14.9
table 14.9
20s 14.7
laptop 14.6
lifestyle 14.5
worker 14.3
portrait 14.2
working 14.1
executive 13.6
medical 13.2
mature 13
color 12.8
coworkers 12.8
horizontal 12.6
holding 12.4
restaurant 12.4
happiness 11.8
two people 11.7
desk 11.5
coat 11.5
bright 11.4
face 11.4
looking 11.2
hospital 11
conference 10.7
discussion 10.7
four 10.5
clothing 10.2
case 10.2
two 10.2
lab coat 10.1
place of business 10.1
suit 9.9
attractive 9.8
40s 9.7
interior 9.7
health 9.7
building 9.7
corporation 9.6
education 9.5
manager 9.3
teacher 9.1
success 8.8
casual clothing 8.8
staff 8.6
nurse 8.6
ethnic 8.6
communication 8.4
employee 8.3
care 8.2
handsome 8
30 35 years 7.9
good mood 7.8
standing 7.8
middle aged 7.8
cooperation 7.7
females 7.6
service 7.6
technology 7.4
specialist 7.4
salon 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

window 95.7
person 93.8
text 85.9
vase 81.7
furniture 80.6
house 79.1
table 78
clothing 70.7
man 50.4
old 42.1

Color Analysis

Face analysis

AWS Rekognition

Age 47-53
Gender Female, 57.7%
Calm 89.6%
Confused 4.5%
Happy 2.3%
Sad 2%
Surprised 0.7%
Disgusted 0.5%
Angry 0.3%
Fear 0.1%

AWS Rekognition

Age 27-37
Gender Male, 97.9%
Calm 93.9%
Happy 2.3%
Surprised 1.1%
Confused 0.9%
Sad 0.5%
Angry 0.5%
Fear 0.4%
Disgusted 0.3%

AWS Rekognition

Age 29-39
Gender Female, 88.4%
Calm 99.3%
Happy 0.3%
Sad 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 31-41
Gender Male, 99.4%
Calm 36.3%
Sad 36.1%
Confused 12.2%
Happy 6.3%
Angry 4.4%
Fear 2.6%
Surprised 1.5%
Disgusted 0.8%

AWS Rekognition

Age 28-38
Gender Male, 99.2%
Calm 65.2%
Surprised 18.1%
Happy 7.1%
Sad 4.8%
Angry 1.5%
Fear 1.2%
Disgusted 1.1%
Confused 1%
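
The five AWS Rekognition blocks above (age range, gender, and per-emotion confidences for each detected face) match the shape of Rekognition face-detection output. A minimal sketch under the same assumptions as the earlier snippet (hypothetical photo.jpg, configured credentials):

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]        # e.g. {"Low": 47, "High": 53}
        gender = face["Gender"]      # e.g. {"Value": "Female", "Confidence": 57.7}
        print(f"Age {age['Low']}-{age['High']}, "
              f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:   # CALM, HAPPY, SAD, ... with confidences
            print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")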

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
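
The Google Vision entries above report likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A minimal sketch using the google-cloud-vision client, again with a hypothetical photo.jpg:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood values: VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY
        print("Joy:", vision.Likelihood(face.joy_likelihood).name,
              "Surprise:", vision.Likelihood(face.surprise_likelihood).name,
              "Headwear:", vision.Likelihood(face.headwear_likelihood).name,
              "Blurred:", vision.Likelihood(face.blurred_likelihood).name)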

Imagga

Traits no traits identified

Feature analysis

Amazon

Person
Person 99.1%
Person 99%
Person 98.1%
Person 97.5%
Person 97.2%

Categories

Imagga

interior objects 99.8%

Captions

Clarifai
created by general-english-image-caption-blip on 2025-05-03

a photograph of a black and white photo of a man and woman sitting at a table -100%

Google Gemini

created by gemini-2.0-flash on 2025-04-29

Here is a description of the image:

The image is an indoor shot of a group of people seated around a dining table, presumably for a meal. It appears to be an older photograph or a photo with a vintage filter, as indicated by the grayscale and overall aesthetic.

A man in a light-colored suit and round glasses is prominently seated at the table, facing the camera. His demeanor suggests a formal occasion or portrait. To his left, a woman is seated, visible in profile, with a gentle expression.

Other people are seated around the table as well. The table is laid with dinnerware, including plates, glasses, and cutlery. There is a decorative centerpiece in the middle of the table.

In the background, there's a cabinet or sideboard with decorative items on display. Two sets of French doors allow light to filter into the room. Above the cabinet, there's a framed picture.

Overall, the image presents a scene of a formal gathering, possibly a family dinner or a social occasion, set in a traditionally styled interior.

created by gemini-2.0-flash-lite on 2025-04-29

Here's a description of the image:

Overall Impression:

The image is a negative of a photograph, likely dating back to the early to mid-20th century. It captures a group of people gathered around a dining table inside what appears to be a well-appointed home. The lighting is diffused, casting a ghostly effect on the scene, typical of negatives.

Setting:

  • Room: The scene is set in a dining room. The room features elegant moldings on the walls and ceiling.
  • Furniture: A large round dining table is the focal point, covered with a tablecloth. A sideboard is visible behind the table, adorned with glassware and decorative items.
  • Windows/Doors: French doors with multiple glass panes are present, leading to the outdoors. The doors are likely letting in daylight and contributing to the overall illumination.
  • Decor: A painting hangs on the wall, and a small picture hangs in the corner behind the dining table.

People:

  • Number: There are at least seven people visible, all seated around the table.
  • Appearance: They appear to be dressed in formal attire: suits and light-colored dresses. Their expressions are difficult to discern due to the negative effect.
  • Activity: The people seem to be engaged in a meal or a formal gathering.

Composition:

  • The composition is centered on the dining table and the people around it.
  • The French doors and the sideboard provide depth and context to the scene.

Overall, the image depicts a formal gathering within a luxurious setting, offering a glimpse into a past era.

Text analysis

Amazon

11418
ar
NAGOY

Google

|| 4 18-
||
4
18-
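
The text analysis fragments above ("11418", "ar", "NAGOY", "|| 4 18-") are OCR readings of markings visible in the photograph. As a minimal sketch, output of this shape could come from Rekognition text detection, under the same assumptions as the earlier snippets:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        # LINE entries are whole detected lines; WORD entries are individual tokens.
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))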