Human Generated Data

Title

Untitled (formally dressed man and woman at dining room table)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5261

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Furniture 99.7
Person 99.4
Human 99.4
Table 99.1
Meal 99.1
Food 99.1
Person 98.9
Apparel 98
Clothing 98
Dish 97.7
Home Decor 97
Room 96.7
Indoors 96.7
Tabletop 96.1
Dining Room 95.7
Tie 84
Accessories 84
Accessory 84
Tablecloth 83
Chair 79.8
Dress 79.7
Dinner 74.7
Supper 74.7
Female 73.1
Dessert 72
Creme 72
Icing 72
Cake 72
Cream 72
Hat 70.6
Face 69.8
People 69.7
Dining Table 69.5
Flower 67.6
Blossom 67.6
Plant 67.6
Portrait 66.5
Photography 66.5
Photo 66.5
Linen 64.6
Woman 57.4
Glass 55.7
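
The rows above pair a label with a confidence score on a 0-100 scale. Below is a minimal sketch of how such labels can be requested with Amazon Rekognition's DetectLabels API; the AWS region, the local filename, and the thresholds are placeholders, not values taken from this record.

```python
# Minimal sketch: retrieving labels like those listed above with Amazon Rekognition.
# Assumes boto3 is configured with AWS credentials and that the photograph is
# available locally under a hypothetical filename.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_4.2002.5261.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MaxLabels=50,
        MinConfidence=55.0,  # the lowest score shown above is Glass at 55.7
    )

# Print each label with its confidence, mirroring the "label score" rows above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```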

Clarifai
created on 2023-10-26

people 99.7
monochrome 98.5
adult 98.1
man 95.7
wedding 94.1
indoors 94.1
woman 93.8
nostalgia 93.6
group 92.1
cooking 91.9
two 89
table 86.8
restaurant 85.5
retro 84.6
meal 82.7
veil 81
three 78.9
several 78.3
military 77.5
celebration 76.8
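
A comparable sketch for Clarifai is below. The endpoint form, the model name ("general-image-recognition"), and the API-key header follow Clarifai's public v2 REST examples and should be treated as assumptions; the key and filename are placeholders.

```python
# Minimal sketch: requesting concepts like those listed above from Clarifai's
# v2 predict endpoint. Model name, key, and filename are placeholders.
import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
MODEL_URL = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

with open("steinmetz_4.2002.5261.jpg", "rb") as image_file:
    image_b64 = base64.b64encode(image_file.read()).decode("utf-8")

payload = {"inputs": [{"data": {"image": {"base64": image_b64}}}]}
headers = {"Authorization": f"Key {API_KEY}"}

response = requests.post(MODEL_URL, json=payload, headers=headers)
response.raise_for_status()

# Concepts come back with a 0-1 "value"; scale to 0-100 to match the rows above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```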

Imagga
created on 2022-01-22

person 29.4
man 28.9
people 28.4
couple 24.4
home 23.9
senior 23.4
male 22.7
adult 21.9
indoors 21.1
happy 20.7
together 18.4
smiling 18.1
sitting 18
table 17.8
men 15.5
cheerful 15.4
drink 15
food 14.5
wine 14
dinner 13.7
lifestyle 13.7
meal 13.1
business 12.7
restaurant 12.7
blackboard 12.6
30s 12.5
enjoying 12.3
mature 12.1
women 11.9
day 11.8
holding 11.6
drinking 11.5
party 11.2
patient 11.1
love 11
wedding 11
senior adult 10.9
60s 10.7
bride 10.7
older 10.7
businessman 10.6
talking 10.5
alcohol 10.4
meeting 10.4
two 10.2
eating 10.1
20s 10.1
worker 9.9
team 9.9
attractive 9.8
lunch 9.7
bouquet 9.6
flowers 9.6
waiter 9.3
smile 9.3
room 9.2
gathering 8.9
job 8.8
sixties 8.8
medical 8.8
casual clothing 8.8
middle aged 8.8
two people 8.7
elderly 8.6
friends 8.5
friendship 8.4
portrait 8.4
modern 8.4
hand 8.4
health 8.3
leisure 8.3
inside 8.3
businesswoman 8.2
dress 8.1
lab coat 8.1
kitchen 8
interior 8
work 7.9
kin 7.7
hospital 7.7
retirement 7.7
coat 7.6
marriage 7.6
businesspeople 7.6
chair 7.6
groom 7.3
indoor 7.3
group 7.3
child 7.2
clothing 7.2
nurse 7.1
to 7.1
teacher 7
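
Imagga tags like those above can be requested from its v2 tagging endpoint; a minimal sketch follows. The API credentials and the image URL are placeholders, and the response layout is based on Imagga's documented format rather than on this record.

```python
# Minimal sketch: requesting tags like those listed above from Imagga's v2
# tagging endpoint using HTTP Basic auth. Credentials and URL are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/steinmetz_4.2002.5261.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry carries a 0-100 confidence and a language-keyed tag name.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```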

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 97.8
wall 95.4
person 91.3
tableware 86.2
candle 84.6
birthday cake 81.5
table 79.7
wedding cake 76.9
woman 72.7
clothing 67.5
black and white 65.9
wedding 53.8
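
The Microsoft tags above correspond to the Azure Computer Vision "Tag Image" operation; a minimal sketch of the v3.2 REST call is below. The resource endpoint, subscription key, and filename are placeholders, not values from this record.

```python
# Minimal sketch: requesting tags like those listed above from the Azure
# Computer Vision Tag Image REST operation (v3.2). Endpoint, key, and filename
# are placeholders.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
SUBSCRIPTION_KEY = "YOUR_AZURE_VISION_KEY"                      # placeholder

with open("steinmetz_4.2002.5261.jpg", "rb") as image_file:
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_file.read(),
    )
response.raise_for_status()

# Confidence is returned on a 0-1 scale; scale to 0-100 to match the rows above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```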

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 31-41
Gender Female, 61.4%
Calm 98.9%
Sad 0.4%
Confused 0.3%
Disgusted 0.2%
Happy 0.1%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 28-38
Gender Female, 92.8%
Calm 94.4%
Sad 1.5%
Happy 1.2%
Surprised 1%
Fear 0.8%
Disgusted 0.6%
Confused 0.3%
Angry 0.2%
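
The two AWS Rekognition blocks above (one per detected face) report an estimated age range, a gender estimate with confidence, and emotion scores. A minimal sketch of how such output can be produced with the DetectFaces API follows; the filename is a placeholder.

```python
# Minimal sketch: producing age, gender, and emotion estimates like those above
# with Amazon Rekognition's DetectFaces API. One block is printed per face.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_4.2002.5261.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request the full attribute set, including emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions are returned unsorted; sort by confidence to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```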

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
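
The Google Vision blocks above report per-face likelihood ratings rather than percentages. A minimal sketch using the google-cloud-vision client library follows; it assumes application default credentials and a placeholder filename, and it maps the record's "Blurred" row to the API's blurred likelihood field.

```python
# Minimal sketch: producing per-face likelihoods like those above with the
# Google Cloud Vision face detection API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.5261.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum value such as VERY_UNLIKELY.
    attributes = {
        "Surprise": face.surprise_likelihood,
        "Anger": face.anger_likelihood,
        "Sorrow": face.sorrow_likelihood,
        "Joy": face.joy_likelihood,
        "Headwear": face.headwear_likelihood,
        "Blurred": face.blurred_likelihood,
    }
    for name, likelihood in attributes.items():
        # Convert e.g. VERY_UNLIKELY into "Very unlikely" to match the rows above.
        print(name, vision.Likelihood(likelihood).name.replace("_", " ").capitalize())
```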

Feature analysis

Amazon

Person 99.4%
Tie 84%
Dining Table 69.5%

Categories

Text analysis

Amazon

Er
0024
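
The detected strings above ("Er", "0024") are the kind of output returned by Amazon Rekognition's DetectText API; a minimal sketch follows, with the filename as a placeholder.

```python
# Minimal sketch: recovering detected text such as the strings listed above
# with Amazon Rekognition's DetectText API.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_4.2002.5261.jpg", "rb") as image_file:
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

# LINE detections group whole strings; WORD detections are their components.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
```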