Human Generated Data

Title

Untitled (people taking food from large buffet)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17207

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Food 99.9
Meal 99.9
Human 99.2
Person 99.2
Person 99
Person 98.6
Person 97.7
Person 97.7
Dish 97.3
Restaurant 96.5
Dinner 96.2
Supper 96.2
Person 93.7
Furniture 91.9
Dining Table 91.9
Table 91.9
Person 91.8
People 90.7
Cafeteria 89.2
Buffet 89.2
Face 83.9
Person 83.4
Lunch 75.9
Female 75.1
Photography 64.5
Photo 64.5
Portrait 64.5
Family 63.9
Indoors 62.8
Tabletop 60.8
Plant 60.6
Chair 59.7
Room 59.1
Platter 59
Dining Room 58.9
Eating 57.9
Woman 57.8
Diner 57.1
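The machine tags above are flat "label score" pairs. A minimal sketch (assuming the tags are available as plain text, exactly as printed here) of parsing them and keeping only high-confidence labels:

```python
# Parse "Label 99.9"-style tag lines into (label, confidence) pairs
# and keep only labels at or above a confidence threshold.
# The sample lines are drawn from the Amazon tags listed above.
def parse_tags(lines, threshold=90.0):
    kept = []
    for line in lines:
        label, _, score = line.rpartition(" ")
        if label and float(score) >= threshold:
            kept.append((label, float(score)))
    return kept

sample = ["Food 99.9", "Meal 99.9", "Cafeteria 89.2", "Diner 57.1"]
print(parse_tags(sample))  # → [('Food', 99.9), ('Meal', 99.9)]
```

The threshold of 90 is an arbitrary illustration; each tagging service uses its own confidence scale and calibration.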

Imagga
created on 2022-02-26

man 27.5
people 22.3
person 20.5
male 18.5
clothing 17.7
salon 17.5
headdress 17.2
mask 16.5
cap 16.2
helmet 12.4
iron lung 11.9
respirator 11.5
medical 11.5
shower cap 10.7
happy 10.6
hand 10.6
hospital 10.4
health 10.4
patient 10.4
men 10.3
holiday 10
professional 9.8
team 9.8
adult 9.8
human 9.7
black 9.6
work 9.6
doctor 9.4
covering 9.2
portrait 9
bathing cap 9
clinic 8.7
room 8.7
uniform 8.5
world 8.4
protection 8.2
equipment 8
table 8
celebration 8
home 8
medicine 7.9
nurse 7.9
case 7.9
costume 7.7
sitting 7.7
device 7.7
restaurant 7.7
care 7.4
tradition 7.4
worker 7.3
hat 7.3
breathing device 7.2
working 7.1
together 7

Google
created on 2022-02-26

Food 90.2
Tableware 84
Black-and-white 82.7
Art 80.5
Cooking 75.3
Chair 73.8
Monochrome photography 72.6
Event 71.9
Monochrome 71.8
Table 70.9
Cuisine 68.7
Plate 67
Room 66.9
Vintage clothing 64.8
Stock photography 64.2
Visual arts 62.6
History 60.8
Illustration 58.4
Dish 57.7
Suit 56.9

Microsoft
created on 2022-02-26

person 99.9
food 97.3
text 97
indoor 93.5
clothing 92.7
black and white 81.9
man 75.7
people 74.4
preparing 69.6
woman 64.9
serving 59
cooking 44.8
table 34
dining table 6.1

Face analysis



AWS Rekognition

Age 18-26
Gender Male, 97.7%
Calm 99.5%
Happy 0.4%
Sad 0.1%
Angry 0%
Disgusted 0%
Confused 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Male, 97.7%
Happy 46.4%
Sad 30.1%
Calm 7.8%
Surprised 5.1%
Fear 3.7%
Confused 3.7%
Disgusted 1.7%
Angry 1.5%

AWS Rekognition

Age 38-46
Gender Male, 99.4%
Calm 99.8%
Happy 0.1%
Sad 0%
Angry 0%
Confused 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 31-41
Gender Female, 90.2%
Calm 99.5%
Sad 0.2%
Surprised 0.1%
Confused 0.1%
Angry 0%
Disgusted 0%
Fear 0%
Happy 0%
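Each Rekognition face block reports a probability for every emotion, so the dominant emotion is simply the highest-scoring one. A small sketch (using the second face's values from above) of selecting it:

```python
# Pick the dominant emotion from a Rekognition-style emotion/percentage map.
# The values are the second face's scores as printed above.
def dominant_emotion(scores):
    return max(scores, key=scores.get)

face2 = {"Happy": 46.4, "Sad": 30.1, "Calm": 7.8, "Surprised": 5.1,
         "Fear": 3.7, "Confused": 3.7, "Disgusted": 1.7, "Angry": 1.5}
print(dominant_emotion(face2))  # → Happy
```

Note that for this face the top score is under 50%, so "Happy" is only a plurality, not a confident call.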

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
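Unlike Rekognition's percentages, Google Vision reports face attributes as ordinal likelihood strings. A hedged sketch mapping those strings to ranks so results can be compared programmatically (the scale shown is Vision's standard likelihood enum; the API also defines an UNKNOWN value, omitted here):

```python
# Map Google Vision likelihood strings to ordinal ranks for comparison.
LIKELIHOOD = {"Very unlikely": 0, "Unlikely": 1, "Possible": 2,
              "Likely": 3, "Very likely": 4}

def at_least_possible(value):
    """True when an attribute is rated Possible or stronger."""
    return LIKELIHOOD[value] >= LIKELIHOOD["Possible"]

print(at_least_possible("Unlikely"))  # → False
```

Under this mapping, every emotion on every face above rates below "Possible", consistent with the calm expressions Rekognition reports.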

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a group of people standing in a kitchen preparing food 95.6%
a group of people preparing food in a kitchen 95.5%
a group of people standing around a kitchen preparing food 95%