Human Generated Data

Title

Untitled (three women at buffet table at party with other women chatting in back)

Date

1953

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9507

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 99.7
Human 99.7
Person 99.1
Person 99
Person 98.9
Clothing 98.1
Apparel 98.1
Room 97
Indoors 97
Person 95.9
Person 93.1
People 81.8
Furniture 81.8
Dressing Room 81.6
Shop 72
Dress 70.6
Hat 64.4
Female 63.1
Food 62.2
Dessert 62.2
Icing 62.2
Creme 62.2
Cake 62.2
Cream 62.2
Costume 60.6
Table 60.5
Boutique 59.3
Evening Dress 55.5
Gown 55.5
Fashion 55.5
Robe 55.5
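
The label/confidence pairs above follow the shape of Amazon Rekognition's DetectLabels response (each label name paired with a 0-100 confidence score). A minimal sketch of how such tags could be produced with boto3 follows; the file name, region, and thresholds are illustrative assumptions, not part of this record.

# Sketch only: Amazon Rekognition label detection via boto3.
# The image path, region, and MinConfidence threshold are assumed for
# illustration; the record above shows only the resulting labels.
import boto3

def rekognition_labels(image_path: str, region: str = "us-east-1"):
    client = boto3.client("rekognition", region_name=region)
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55.0,
    )
    # Each entry pairs a label name with a confidence score (0-100).
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]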

Imagga
created on 2022-01-28

groom 42.4
bride 37.6
people 32.3
wedding 31.3
dress 28
love 27.6
couple 26.1
bouquet 24.6
happiness 23.5
adult 23.4
kin 22.9
married 22
women 21.4
person 21.3
man 20.2
two 17.8
celebration 17.5
portrait 17.5
flowers 17.4
marriage 17.1
smiling 16.6
male 16.4
happy 16.3
indoors 15.8
wife 15.2
home 15.2
fashion 15.1
interior 15
lady 14.6
men 14.6
ceremony 14.6
husband 14.4
family 14.2
wed 13.7
indoor 13.7
romance 13.4
cheerful 13
veil 12.7
gown 12.7
mother 12.6
life 12.3
lifestyle 12.3
clothing 12.3
sitting 12
flower 11.5
smile 11.4
face 11.4
day 11
romantic 10.7
together 10.5
pretty 10.5
attractive 10.5
room 10.3
elegance 10.1
holiday 10
human 9.7
bridal 9.7
one 9.7
party 9.5
salon 9.4
rose 9.4
clothes 9.4
event 9.2
house 9.2
holding 9.1
old 9.1
matrimony 8.9
decoration 8.9
loving 8.6
care 8.2
outdoors 8.2
looking 8
cute 7.9
model 7.8
engagement 7.7
luxury 7.7
togetherness 7.6
relaxation 7.5
passion 7.5
senior 7.5
style 7.4
retro 7.4
new 7.3
shop 7.2
color 7.2
child 7.2
sexy 7.2
domestic 7.1
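
The Imagga scores above resemble the output of Imagga's v2 image-tagging REST API, which returns tag/confidence pairs. The sketch below is an assumption: the endpoint, upload field, and response shape follow Imagga's public API documentation, and the credentials and file name are placeholders.

# Sketch only: Imagga v2 tagging endpoint; credentials and file name are
# placeholders, and the response shape is assumed from Imagga's public docs.
import requests

def imagga_tags(image_path: str, api_key: str, api_secret: str):
    with open(image_path, "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(api_key, api_secret),
            files={"image": f},
        )
    response.raise_for_status()
    payload = response.json()
    # Each entry pairs an English tag with a confidence score (0-100).
    return [(t["tag"]["en"], t["confidence"]) for t in payload["result"]["tags"]]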

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

person 91
table 89.5
indoor 87.2
window 85.2
candle 51.8
old 45.1
several 11.7

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 99.2%
Calm 43.6%
Sad 42.2%
Confused 5.3%
Surprised 4.8%
Angry 1.4%
Disgusted 1.4%
Happy 0.7%
Fear 0.6%

AWS Rekognition

Age 45-51
Gender Female, 93.4%
Calm 73.8%
Confused 8.4%
Happy 8.1%
Fear 4.9%
Surprised 2.8%
Sad 0.9%
Disgusted 0.8%
Angry 0.4%
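
The two age/gender/emotion breakdowns above match the structure of Amazon Rekognition's DetectFaces response, which reports an age range, a gender estimate with confidence, and per-emotion confidences for each detected face. A minimal boto3 sketch follows; the file name and region are assumed placeholders.

# Sketch only: Amazon Rekognition face analysis via boto3.
import boto3

def rekognition_faces(image_path: str, region: str = "us-east-1"):
    client = boto3.client("rekognition", region_name=region)
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )
    faces = []
    for face in response["FaceDetails"]:
        faces.append({
            "age_range": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
            # Emotions come back with per-emotion confidences, as listed above.
            "emotions": sorted(
                ((e["Type"], e["Confidence"]) for e in face["Emotions"]),
                key=lambda pair: pair[1],
                reverse=True,
            ),
        })
    return faces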

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
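
The three likelihood tables above follow Google Cloud Vision face detection, which reports bucketed likelihoods (very unlikely through very likely) rather than percentages. A sketch with the google-cloud-vision Python client follows; the file name is an illustrative placeholder.

# Sketch only: Google Cloud Vision face detection.
from google.cloud import vision

def google_vision_faces(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    faces = []
    for face in response.face_annotations:
        # Likelihoods are enum buckets (e.g. VERY_UNLIKELY), as in the record above.
        faces.append({
            "surprise": vision.Likelihood(face.surprise_likelihood).name,
            "anger": vision.Likelihood(face.anger_likelihood).name,
            "sorrow": vision.Likelihood(face.sorrow_likelihood).name,
            "joy": vision.Likelihood(face.joy_likelihood).name,
            "headwear": vision.Likelihood(face.headwear_likelihood).name,
            "blurred": vision.Likelihood(face.blurred_likelihood).name,
        })
    return faces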

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a vintage photo of a group of people sitting at a table 91.8%
a vintage photo of a group of people in a room 91.7%
a vintage photo of a group of people sitting around a table 91.6%
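
The three ranked captions with confidence scores are the form returned by Azure Computer Vision's Describe Image operation, the same service family behind the Microsoft tag list above. The sketch below uses the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, file name, and candidate count are placeholders or assumptions.

# Sketch only: Azure Computer Vision image description.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

def azure_captions(image_path: str, endpoint: str, key: str):
    client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))
    with open(image_path, "rb") as f:
        # Ask for up to three candidate captions, as shown in the record above.
        result = client.describe_image_in_stream(f, max_candidates=3)
    return [(caption.text, caption.confidence) for caption in result.captions]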

Text analysis

Amazon

2
22
T
L
T Y
Y
-
KODVK-EVEEiA
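
The fragments above are the kind of output Amazon Rekognition's DetectText returns for incidental text in a photograph (here apparently including a partially read film-edge marking). A minimal boto3 sketch follows; the file name and region are assumed placeholders.

# Sketch only: Amazon Rekognition text detection via boto3.
import boto3

def rekognition_text(image_path: str, region: str = "us-east-1"):
    client = boto3.client("rekognition", region_name=region)
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    # WORD detections are short fragments like those listed above;
    # LINE detections group them into lines.
    return [
        (d["DetectedText"], d["Type"], d["Confidence"])
        for d in response["TextDetections"]
    ]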