Human Generated Data

Title

Untitled (two couples having Campbell's soup at formal dinner party/soup being served by maid)

Date

1937

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12061

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 99.5
Person 99.5
Person 99.4
Person 99.2
Person 99.2
Person 98.5
Chair 87.1
Furniture 87.1
Food 85.8
Meal 85.8
Restaurant 80.4
Sitting 78.5
Table 78.2
Indoors 75.2
People 74.1
Interior Design 73.4
Dining Table 72.4
Dish 71
Pottery 69.7
Room 67.3
Photo 62.2
Photography 62.2
Dating 61.3
Dinner 58.6
Supper 58.6
Waiter 57.3
Portrait 56.6
Face 56.6
Cafeteria 55.8
Dining Room 55.7
Cafe 55.1

Imagga
created on 2022-01-15

person 44.5
man 43
people 37.4
patient 35.5
male 34.1
room 33.3
adult 31.4
sitting 28.3
businessman 27.4
meeting 27.3
office 27.2
business 26.7
couple 25.3
case 25.1
sick person 24.4
men 24
indoors 23.7
hospital 23.7
together 23.7
businesswoman 23.6
nurse 23.3
home 23.1
table 22.5
working 22.1
happy 21.9
team 21.5
classroom 20.9
colleagues 20.4
smiling 20.3
laptop 20
businesspeople 19.9
group 19.3
work 18.1
computer 17.6
teamwork 17.6
women 17.4
desk 17
30s 16.4
day 15.7
smile 15.7
talking 15.2
professional 15.1
lifestyle 14.5
mature 13.9
20s 13.7
40s 13.6
portrait 13.6
kin 13.2
indoor 12.8
worker 12.7
casual 12.7
discussion 12.7
horizontal 12.6
restaurant 12.3
cheerful 12.2
senior 12.2
executive 12.1
corporate 12
coworkers 11.8
communication 11.8
staff 11.6
two 11
associates 10.8
suit 10.8
teacher 10.7
daytime 10.6
color 10.6
adults 10.4
bright 10
face 9.9
discussing 9.8
days 9.8
family 9.8
job 9.7
specialist 9.7
cooperation 9.7
mid adult 9.6
four 9.6
workplace 9.5
happiness 9.4
clothing 9.4
student 9.1
interior 8.8
education 8.7
females 8.5
friends 8.5
clothes 8.4
presentation 8.4
confident 8.2
technology 8.2
medical 7.9
boardroom 7.9
elementary age 7.9
child 7.9
casual clothing 7.8
conference 7.8
modern 7.7
planning 7.7
project 7.7
elderly 7.7
ethnic 7.6
drink 7.5
holding 7.4
mother 7.4
coffee 7.4
successful 7.3
beverage 7.2
dinner 7.2
love 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

person 98
text 97.8
table 92.4
indoor 89.8
clothing 85
furniture 79
man 74.9
house 60.5
people 59.6
human face 52.7

Face analysis

Amazon

Google

AWS Rekognition

Age 27-37
Gender Female, 52%
Happy 93.5%
Calm 4.3%
Sad 1.4%
Surprised 0.3%
Angry 0.2%
Confused 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 98.7%
Calm 99.3%
Surprised 0.3%
Happy 0.1%
Confused 0.1%
Angry 0.1%
Sad 0.1%
Disgusted 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Chair 87.1%

Captions

Microsoft

a group of people sitting at a table 89%
a group of people sitting around a table 88.3%
a group of people sitting on a table 81.4%

Text analysis

Amazon

5698
livat
PLED livat
PLED
DELIMOCA

Google

8195
8195