Human Generated Data

Title

Untitled (bride throwing bouquet to jumping women in reception hall)

Date

1955

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9551

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Clothing 99.9
Apparel 99.9
Human 98.8
Person 98.1
Person 98
Person 97.9
Dance Pose 96.8
Leisure Activities 96.8
Person 95.4
Robe 94.6
Fashion 94.6
Gown 93.9
Wedding 90
Dance 88.6
Person 87.3
Person 87.2
Person 86.8
Female 86.4
Person 82.8
Person 82.6
Wedding Gown 78.8
Evening Dress 77.5
Bride 73.6
Person 72.8
Woman 72
Dress 70.6
Person 62.4
Portrait 60
Photography 60
Face 60
Photo 60
Indoors 58.4
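
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch, assuming boto3 and a local copy of the image; the file name and MinConfidence threshold are placeholders, not part of this record:

    import boto3

    # Request labels for a local image; the list above bottoms out around 58%.
    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')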

Clarifai
created on 2023-10-27

wedding 99.3
bride 99.1
people 99
veil 98.4
dancing 97
woman 95.3
dress 95.2
group 93.2
groom 91.7
wear 91.5
dancer 88
bridal 87.3
adult 87.1
fashion 85.4
illustration 84.5
ceremony 84.5
man 83
gown 80.7
many 79.7
bridesmaid 79.1
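
The Clarifai concepts above can be reproduced with the platform's v2 "outputs" endpoint. A hedged sketch, assuming the public general model and API-key authentication; the model id, key, and image URL are placeholders, and the current Clarifai docs should be checked for the authentication scheme in use:

    import requests

    # Concept values come back on a 0-1 scale; multiply by 100 to match the list above.
    MODEL_ID = "general-image-recognition"  # assumed public general model id
    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": "Key YOUR_API_KEY"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    )
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')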

Imagga
created on 2022-01-28

negative 28.5
bride 25.2
wedding 24.8
film 21.6
dress 20.8
groom 18.6
photographic paper 16.7
day 16.5
people 16.2
love 15.8
marriage 14.2
outdoors 13.4
sky 13.4
hall 13.3
couple 13.1
winter 12.8
snow 12.5
art 12.1
person 11.8
man 11.4
adult 11.2
photographic equipment 11.2
landscape 11.1
outdoor 10.7
ice 10.4
boutique 10
water 10
city 10
outfit 9.9
romantic 9.8
gown 9.8
celebration 9.6
women 9.5
happiness 9.4
happy 9.4
life 9.4
world 9.2
travel 9.1
fashion 9
summer 9
married 8.6
cold 8.6
wife 8.5
male 8.5
two 8.5
portrait 8.4
human 8.2
style 8.2
structure 8
modern 7.7
attractive 7.7
park 7.4
design 7.3
glass 7.2
dancer 7.2
sea 7.1
building 7.1
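
Imagga reports its tags on the same 0-100 confidence scale shown above. A sketch of the /v2/tags endpoint, assuming key/secret Basic authentication; the key, secret, and image URL are placeholders:

    import requests

    # Tag the image by URL and print each tag with its confidence.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )
    for item in response.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')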

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

dance 99
text 98.7
person 89.8
outdoor 88.9
clothing 85
dress 83.6
wedding dress 76.9
woman 66.7
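
The Microsoft tags resemble output from Azure Computer Vision's Tag Image operation, which reports confidences on a 0-1 scale (shown above as percentages). A hedged sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders:

    import requests

    # Tag the image by URL and print each tag with its confidence as a percentage.
    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
        json={"url": "https://example.com/photo.jpg"},
    )
    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')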

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Female, 96%
Calm 90%
Fear 3.4%
Sad 3.1%
Surprised 1.1%
Angry 1%
Happy 0.6%
Confused 0.4%
Disgusted 0.4%

AWS Rekognition

Age 23-33
Gender Male, 96%
Fear 63.2%
Calm 20.9%
Sad 8.7%
Happy 2.1%
Confused 1.8%
Surprised 1.6%
Angry 1%
Disgusted 0.8%

AWS Rekognition

Age 16-22
Gender Male, 82.6%
Calm 85.5%
Fear 6%
Sad 3.9%
Confused 1.2%
Disgusted 1.1%
Angry 1%
Happy 0.9%
Surprised 0.4%

AWS Rekognition

Age 18-26
Gender Female, 90.3%
Calm 53.9%
Sad 21.3%
Confused 15.4%
Happy 4.2%
Disgusted 2%
Angry 1.3%
Fear 1%
Surprised 0.8%

AWS Rekognition

Age 20-28
Gender Female, 85.4%
Calm 40.2%
Sad 16.4%
Fear 14.3%
Angry 8.2%
Confused 7.6%
Surprised 5.4%
Happy 5.1%
Disgusted 2.7%

AWS Rekognition

Age 23-31
Gender Female, 89.2%
Calm 98.3%
Sad 0.8%
Happy 0.2%
Angry 0.2%
Confused 0.2%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 11-19
Gender Female, 85.9%
Happy 62.5%
Calm 19.7%
Fear 5.7%
Sad 5.7%
Confused 2.8%
Disgusted 1.3%
Surprised 1.1%
Angry 1%

AWS Rekognition

Age 24-34
Gender Male, 99.4%
Happy 31.9%
Sad 31.5%
Calm 24.2%
Angry 3.3%
Confused 3%
Fear 2.6%
Surprised 2.1%
Disgusted 1.4%
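
The age ranges, gender estimates, and emotion scores above match the shape of Amazon Rekognition's DetectFaces response when all attributes are requested. A minimal sketch, assuming boto3 and a local copy of the image; the file name is a placeholder:

    import boto3

    # Request full face attributes and print them in the format used above.
    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')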

Feature analysis

Amazon

Person
Person 98.1%
Person 98%
Person 97.9%
Person 95.4%
Person 87.3%
Person 87.2%
Person 86.8%
Person 82.8%
Person 82.6%
Person 72.8%
Person 62.4%

Categories

Text analysis

Amazon

o
of
- 5 2
P of o - 5 2
P
KODAK-EA--EIEW
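
Fragments like these (apparently film edge markings) are what Amazon Rekognition's DetectText operation returns for a scanned negative. A minimal sketch, assuming boto3 and a local copy of the image; the file name is a placeholder:

    import boto3

    # Detect text in the image and print each line-level detection.
    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])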

Google

2 00 S 2
2
00
S