Human Generated Data

Title

Untitled (people getting food from buffet table)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16453

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Clothing 99.7
Apparel 99.7
Icing 99.7
Cake 99.7
Dessert 99.7
Cream 99.7
Food 99.7
Creme 99.7
Dress 99.3
Person 98.3
Human 98.3
Person 96.4
Person 95.8
Face 90.6
Chair 89.1
Furniture 89.1
Female 85.2
Suit 83
Coat 83
Overcoat 83
Meal 82.2
Sweets 82
Confectionery 82
People 81.9
Gown 80.2
Fashion 80.2
Robe 78.9
Dish 78.2
Table 78
Woman 72.2
Portrait 71.1
Photography 71.1
Photo 71.1
Wedding 69
Plant 68.9
Wedding Gown 63.1
Dining Table 60.3
Home Decor 59.9
Tablecloth 57.9
Torte 57.4
Birthday Cake 55.9
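
The label/confidence pairs above are the kind of output AWS Rekognition's DetectLabels operation returns. The following is a minimal sketch of such a call, not the museum's pipeline; the file name is a placeholder and the thresholds are assumptions (the list above bottoms out around 55.9).

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the scanned print; not the museum's actual asset.
with open("buffet_photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # the list above bottoms out around 55.9
    )

# Print label/confidence pairs in the same "Name score" form used above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```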

Clarifai
created on 2023-10-28

people 100
adult 99.3
woman 98.9
group 98.9
two 98.8
three 96
man 95.4
group together 94.9
four 93
monochrome 90.8
offspring 90.3
room 90.2
home 88.4
elderly 88.1
several 86.8
many 86
commerce 85.5
leader 85.1
child 84.6
wear 84.1
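
Concept/confidence pairs like these resemble the output of Clarifai's general image-recognition model. The sketch below uses Clarifai's public v2 REST predict endpoint; the model ID, token, image URL, and response shape are assumptions, not details of the pipeline that produced the tags above.

```python
import requests

# Placeholders: personal access token and a public image URL.
PAT = "..."
IMAGE_URL = "https://example.org/buffet_photo.jpg"

# Assumed endpoint/model ID for Clarifai's v2 predict API.
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
payload = {"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]}

resp = requests.post(url, json=payload, headers={"Authorization": f"Key {PAT}"})
resp.raise_for_status()

# Clarifai reports concept values in 0-1; scale to match the 0-100 scores above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```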

Imagga
created on 2022-02-11

man 34.9
male 31.2
people 30.7
person 27.1
businessman 25.6
adult 25
couple 24.4
business 22.5
teacher 20.3
men 19.7
office 19.4
professional 18.2
smiling 18.1
sitting 18
executive 17.7
happy 17.5
women 17.4
group 16.1
table 15.6
room 15.4
job 15
worker 14.4
portrait 14.2
work 14.2
meeting 14.1
senior 14.1
indoors 14.1
life 14
musical instrument 13.9
indoor 13.7
two 13.5
bartender 13.4
interior 13.3
cheerful 13
percussion instrument 12.9
groom 12.4
corporate 12
home 12
modern 11.9
suit 11.8
talking 11.4
smile 11.4
together 11.4
desk 11.3
manager 11.2
mature 11.2
educator 11.1
happiness 11
communication 10.9
team 10.7
old 10.4
love 10.3
lifestyle 10.1
color 10
holding 9.9
classroom 9.9
chair 9.8
bride 9.6
employee 9.4
teamwork 9.3
businesswoman 9.1
waiter 8.9
to 8.8
colleagues 8.7
businesspeople 8.5
togetherness 8.5
marimba 8.2
new 8.1
family 8
conference 7.8
education 7.8
building 7.7
husband 7.6
wife 7.6
career 7.6
house 7.5
clothing 7.4
phone 7.4
handsome 7.1
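
The Imagga tags follow the same tag-plus-confidence pattern. Below is a rough sketch against Imagga's public v2 tagging endpoint; the endpoint, credentials, image URL, and response shape are assumptions based on Imagga's documented REST API, not on this page.

```python
import requests

# Placeholders: Imagga API credentials and a public image URL.
API_KEY, API_SECRET = "...", "..."
IMAGE_URL = "https://example.org/buffet_photo.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",   # assumed v2 tagging endpoint
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),         # HTTP basic auth with key/secret
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```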

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 93.5
person 90.6
indoor 90.4
clothing 88.6
woman 84.2
human face 76.2
table 72.7
furniture 66.7
vase 60.3
old 51.6
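
The Microsoft tags resemble the output of Azure Computer Vision's image tagging operation. A sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders, and the exact API version behind the scores above is not known from this page.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholders: Azure resource endpoint, key, and a public image URL.
client = ComputerVisionClient(
    "https://<resource-name>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

result = client.tag_image("https://example.org/buffet_photo.jpg")

# The SDK reports confidences in 0-1; scale to match the 0-100 scores above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```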

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 25-35
Gender Male, 64.3%
Calm 92.3%
Surprised 2.2%
Happy 2%
Confused 1.1%
Angry 1%
Sad 0.7%
Disgusted 0.4%
Fear 0.3%

AWS Rekognition

Age 48-54
Gender Male, 81.6%
Calm 99.3%
Happy 0.6%
Sad 0.1%
Confused 0%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 21-29
Gender Male, 96%
Happy 68%
Sad 22.1%
Calm 6.1%
Confused 1.4%
Angry 1.2%
Surprised 0.5%
Fear 0.5%
Disgusted 0.3%
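
The age range, gender, and emotion scores above match the attributes Rekognition's DetectFaces returns. A minimal sketch, assuming a local copy of the scan:

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the scanned print.
with open("buffet_photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Print all emotion scores, highest first, matching the layout above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```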

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely
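
The Google Vision rows report per-face likelihoods (surprise, anger, sorrow, joy, headwear, blur). A sketch of reading them with the google-cloud-vision client, assuming a local copy of the image:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the scanned print.
with open("buffet_photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Likelihood names print as e.g. VERY_UNLIKELY rather than "Very unlikely".
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```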

Feature analysis

Amazon

Person
Person 98.3%
Person 96.4%
Person 95.8%
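
The per-person scores under Feature analysis (98.3%, 96.4%, 95.8%) line up with the Person confidences in the tag list above and with the per-instance detections Rekognition attaches to the Person label. A minimal sketch of reading those instances, again with a placeholder file name:

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the scanned print.
with open("buffet_photo.jpg", "rb") as f:
    labels = client.detect_labels(Image={"Bytes": f.read()})["Labels"]

for label in labels:
    if label["Name"] == "Person":
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]  # relative Left/Top/Width/Height
            print(f"Person {instance['Confidence']:.1f}%  box={box}")
```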

Text analysis

Amazon

1906
1956
Anniversary
14
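
The strings above are the kind of result Rekognition's DetectText returns. A minimal sketch, assuming a local copy of the scan:

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical local copy of the scanned print.
with open("buffet_photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both LINE and WORD detections; the words are what is
# listed above ("1906", "1956", "Anniversary", "14").
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])
```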

Google

14 1906 1956
14
1906
1956
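
Google Vision's text detection returns one annotation for the full recovered text followed by one per token, which is why the list above starts with "14 1906 1956" and then repeats each token. A sketch, assuming a local copy of the image:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the scanned print.
with open("buffet_photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full recovered text ("14 1906 1956"); the rest
# are the individual tokens, matching the list above.
for annotation in response.text_annotations:
    print(annotation.description)
```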