Human Generated Data

Title

Untitled (formally dressed men being served food)

Date

1938

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7464

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.6
Human 99.6
Meal 99.5
Food 99.5
Person 98.7
Person 97.1
Dish 95.3
Person 80.3
Clothing 79.4
Apparel 79.4
Cafeteria 76.1
Restaurant 76.1
Person 71.9
Sunglasses 70.8
Accessories 70.8
Accessory 70.8
People 69.3
Buffet 57.1

Clarifai
created on 2023-10-25

people 99.9
adult 99.1
monochrome 97.9
group 97.7
man 97.3
woman 94.8
many 94.2
group together 90
administration 83
child 82.5
several 82
wear 79.7
war 78.6
elderly 75.3
military 75
facial expression 74.8
commerce 73.9
portrait 72.7
indoors 70.2
four 70

Imagga
created on 2022-01-08

man 43.7
couple 35.7
seller 34.9
home 33.5
senior 32.8
people 31.2
male 31.2
sitting 28.3
person 27.6
together 27.2
smiling 26.8
adult 26.4
happy 25.7
meal 24.9
food 24.2
men 23.2
table 22.5
indoors 22
dinner 21.7
lunch 21.5
cheerful 20.3
restaurant 20.3
eating 20.2
waiter 20.1
drink 19.2
mature 18.6
wine 18.5
holding 18.2
enjoying 18
women 15.8
lifestyle 15.2
day 14.9
room 14.9
60s 14.7
30s 14.4
drinking 14.4
family 14.2
having 13.6
professional 13.3
friends 13.2
alcohol 13
dining-room attendant 12.9
grandma 12.9
20s 12.8
sixties 12.8
kitchen 12.5
working 12.4
talking 12.4
stall 12.2
business 12.1
smile 12.1
worker 12
two 11.9
happiness 11.8
colleagues 11.7
businessman 11.5
husband 11.4
wife 11.4
meeting 11.3
employee 11.3
kin 11.2
two people 10.7
older 10.7
elderly 10.5
cooking 10.5
old 10.4
businesspeople 10.4
casual 10.2
senior adult 9.9
40s 9.7
middle aged 9.7
job 9.7
daughter 9.6
adults 9.5
indoor 9.1
plate 9.1
businesswoman 9.1
group 8.9
retired 8.7
mid adult 8.7
work 8.6
glass 8.6
friendship 8.4
hand 8.4
occupation 8.2
outdoors 8.2
beverage 8.1
life 8.1
office 8.1
romantic 8
to 8
patient 8
medical 7.9
dining room 7.9
four people 7.9
casual clothing 7.8
boy 7.8
waist 7.7
busy 7.7
retirement 7.7
four 7.7
dining 7.6
mother 7.6
chair 7.6
coffee 7.4
care 7.4
inside 7.4
breakfast 7.3
cup 7.2
celebration 7.2
team 7.2
grandfather 7.2
handsome 7.1
portrait 7.1
face 7.1
interior 7.1

Google
created on 2022-01-08

Food 96.4
Photograph 94.3
White 92.2
Table 89.7
Plate 89.4
Tableware 89.2
Hat 84.2
Black-and-white 82.5
Cooking 80.6
Basket 74.5
Snapshot 74.3
Monochrome photography 73.7
Event 72.5
Cuisine 71.5
Dish 69.2
Monochrome 68.4
Dessert 68
Finger food 67.1
Culinary art 67
Stock photography 66.6

Microsoft
created on 2022-01-08

person 99.9
text 98.1
clothing 96
man 95.2
indoor 88.7
black and white 88.5
food 85
woman 59.3
preparing 54.9
cooking 23.2
several 11

Face analysis

AWS Rekognition

Age 54-64
Gender Male, 60.7%
Calm 80.1%
Disgusted 10.6%
Sad 3.8%
Surprised 2%
Happy 1.2%
Fear 1%
Angry 0.9%
Confused 0.5%

AWS Rekognition

Age 29-39
Gender Male, 92.9%
Calm 57.8%
Sad 22.3%
Happy 10.7%
Disgusted 2.2%
Confused 2.1%
Angry 1.9%
Surprised 1.5%
Fear 1.4%

AWS Rekognition

Age 53-61
Gender Female, 58.2%
Calm 98.2%
Surprised 0.5%
Angry 0.3%
Happy 0.3%
Disgusted 0.2%
Fear 0.2%
Sad 0.2%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Sunglasses 70.8%

Text analysis

Amazon

3818

Google

8882
8882