Human Generated Data

Title

Untitled (people standing at cocktail party)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17870

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.9
Apparel 99.9
Dress 99.9
Person 99.7
Human 99.7
Person 99.5
Person 99.1
Person 99
Person 98.6
Person 98.4
Person 97.4
Person 95.5
Female 95.4
Fashion 90.4
Gown 90.4
Evening Dress 90.4
Robe 90.4
Woman 88.5
Dog 81.5
Animal 81.5
Mammal 81.5
Canine 81.5
Pet 81.5
Person 76.2
Suit 73.7
Coat 73.7
Overcoat 73.7
Lamp 69.3
Indoors 68
People 67
Room 63.9
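Each line in the list above is a label followed by a confidence score (0-100) as returned by the tagging service. A minimal sketch of turning such lines into structured records, assuming only the "label score" layout shown above (the helper name `parse_labels` is illustrative, not part of any vendor API):

```python
def parse_labels(lines):
    """Parse 'Label score' lines (e.g. 'Clothing 99.9') into (label, score) pairs.

    Splits on the last whitespace so multi-word labels like
    'Evening Dress 90.4' are handled correctly.
    """
    records = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        label, score = line.rsplit(None, 1)
        records.append((label, float(score)))
    return records

tags = parse_labels(["Clothing 99.9", "Evening Dress 90.4", "Person 99"])
```

Splitting from the right is the key design choice: splitting on the first space would break any label containing more than one word.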

Imagga
created on 2022-02-26

groom 40.6
people 31.8
person 30
bride 29.7
couple 27.9
adult 27.4
wedding 26.7
teacher 25.8
man 25.5
dress 25.3
love 25.2
male 22.7
happy 21.9
educator 20.4
two 19.5
marriage 18
happiness 18
married 17.3
senior 15.9
bouquet 15.3
celebration 15.1
clothing 15
professional 14.7
gown 14.2
together 14
church 13.9
ceremony 13.6
women 13.4
men 12.9
fashion 12.1
old 11.8
suit 11.7
bridal 11.7
portrait 11.6
holiday 11.5
husband 11.4
flowers 11.3
life 11
family 10.7
sexy 10.4
veil 9.8
human 9.7
lady 9.7
new 9.7
business 9.7
businessman 9.7
home 9.6
metropolitan 9.5
wife 9.5
day 9.4
smiling 9.4
clothes 9.4
smile 9.3
tradition 9.2
face 9.2
catholic 9.1
wed 8.8
looking 8.8
hair 8.7
costume 8.6
elegance 8.4
attractive 8.4
fan 8.4
hand 8.4
traditional 8.3
outfit 8.1
cheerful 8.1
romance 8
lifestyle 7.9
commitment 7.9
joy 7.5
religious 7.5
outdoors 7.5
kin 7.4
event 7.4
team 7.2
romantic 7.1
indoors 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

dress 99.5
wedding dress 97.6
person 96.6
clothing 96.3
woman 94.6
bride 93.5
sport 81.9
black and white 77.2
text 74.2
wedding 64.5
dancer 62.8

Face analysis

Amazon

Google

AWS Rekognition

Age 35-43
Gender Male, 55.7%
Sad 95.5%
Happy 1.6%
Angry 1.2%
Fear 0.8%
Calm 0.4%
Disgusted 0.2%
Surprised 0.2%
Confused 0.2%

AWS Rekognition

Age 40-48
Gender Female, 90.2%
Calm 99.5%
Sad 0.3%
Surprised 0.1%
Happy 0.1%
Angry 0%
Disgusted 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Female, 96.2%
Happy 67.1%
Calm 30.7%
Sad 1%
Confused 0.4%
Angry 0.3%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 25-35
Gender Male, 99.7%
Calm 97%
Happy 1.8%
Surprised 0.4%
Sad 0.4%
Disgusted 0.2%
Angry 0.2%
Confused 0%
Fear 0%

AWS Rekognition

Age 34-42
Gender Male, 74.5%
Calm 82.6%
Sad 8.4%
Confused 2.6%
Angry 1.8%
Surprised 1.6%
Happy 1.4%
Disgusted 1.3%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
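Each per-face block above lists every emotion with its confidence; the face's dominant emotion is simply the highest-scoring entry. A minimal sketch, assuming the scores are held in a plain dict keyed by emotion name (the sample values are copied from the first AWS Rekognition block above):

```python
def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda kv: kv[1])

# Scores from the first face analysis block above.
face = {"Sad": 95.5, "Happy": 1.6, "Angry": 1.2, "Fear": 0.8,
        "Calm": 0.4, "Disgusted": 0.2, "Surprised": 0.2, "Confused": 0.2}
```

For this face, `dominant_emotion(face)` picks "Sad" at 95.5%, matching the ordering shown in the listing.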

Feature analysis

Amazon

Person 99.7%
Dog 81.5%

Captions

Microsoft

a group of people standing in a room 96.4%
a group of people standing in front of a store 85.6%
a group of people standing around each other 85.5%

Text analysis

Amazon

26.
- NACOX
-YT37A'2 - NACOX
-YT37A'2

Google

26.
26.