Human Generated Data

Title

Untitled (women dressed as queen and princesses)

Date

1957

People

Artist: C. Bennette Moore, American 1879 - 1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21846

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Apparel 99.7
Clothing 99.7
Person 98.9
Human 98.9
Person 98.4
Person 95.3
Person 94.5
Person 93.8
Person 92.7
Female 91.1
People 88.6
Dress 87.9
Person 86.9
Person 86.5
Person 86.3
Fashion 83.3
Person 82.4
Costume 78.7
Art 78.1
Painting 78.1
Person 76.7
Woman 75.3
Face 69.8
Crowd 66.5
Photography 65
Photo 65
Cloak 63.1
Portrait 62.5
Girl 62.1
Robe 58.4
Gown 58.4
Evening Dress 58.4
Indoors 57.8
Room 56
Person 50.4

Imagga
created on 2022-03-11

groom 79.5
brass 48.1
cornet 43.2
bride 42.4
dress 39.7
wedding 36.8
wind instrument 36
love 31.5
couple 29.6
people 26.7
musical instrument 24.2
married 22
marriage 21.8
gown 19.9
fashion 18.8
man 18.8
happiness 18.8
happy 18.8
person 17.8
bridal 16.5
bouquet 16
veil 15.7
flowers 15.6
celebration 14.3
adult 14.3
wife 14.2
attractive 14
ceremony 13.6
black 13
matrimony 12.8
portrait 12.3
face 12.1
church 12
two 11.8
wed 11.8
romantic 11.6
husband 11.4
new 11.3
romance 10.7
together 10.5
old 10.4
clothing 10.4
clothes 10.3
women 10.3
male 9.9
commitment 9.8
pretty 9.8
lady 9.7
style 9.6
party 9.4
day 9.4
holiday 9.3
smile 9.3
elegance 9.2
flower 9.2
kin 9
family 8.9
sexy 8.8
formal 8.6
model 8.5
hand 8.3
human 8.2
religion 8.1
life 7.9
hair 7.9
boutique 7.8
brunette 7.8
future 7.4
art 7.4
event 7.4
design 7.3
gorgeous 7.2
smiling 7.2
color 7.2
posing 7.1

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

wedding dress 97.6
person 97.4
dress 95.3
clothing 93.6
bride 91.1
woman 88.8
standing 77.8
text 73
group 59.5
posing 53.8

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 74.8%
Calm 96.1%
Sad 2.7%
Happy 0.8%
Fear 0.1%
Confused 0.1%
Surprised 0.1%
Disgusted 0.1%
Angry 0%

AWS Rekognition

Age 30-40
Gender Female, 90.4%
Happy 52%
Surprised 42.3%
Calm 3.9%
Sad 0.7%
Confused 0.4%
Disgusted 0.3%
Fear 0.3%
Angry 0.2%

AWS Rekognition

Age 40-48
Gender Female, 59.4%
Happy 94%
Surprised 2.7%
Sad 0.9%
Calm 0.8%
Disgusted 0.5%
Confused 0.4%
Angry 0.3%
Fear 0.3%

AWS Rekognition

Age 29-39
Gender Male, 60.1%
Calm 45.8%
Happy 44.8%
Surprised 3.8%
Disgusted 1.9%
Sad 1.6%
Confused 1%
Fear 0.6%
Angry 0.6%

AWS Rekognition

Age 35-43
Gender Female, 99.6%
Calm 66.4%
Surprised 28.2%
Happy 3.3%
Disgusted 0.7%
Sad 0.5%
Fear 0.4%
Confused 0.3%
Angry 0.3%

AWS Rekognition

Age 38-46
Gender Female, 92%
Happy 66.2%
Calm 25.9%
Surprised 5.5%
Sad 1.6%
Confused 0.3%
Fear 0.2%
Angry 0.2%
Disgusted 0.1%

AWS Rekognition

Age 35-43
Gender Male, 71.7%
Surprised 91.4%
Calm 3.2%
Disgusted 2.9%
Happy 1.1%
Fear 0.5%
Angry 0.4%
Confused 0.3%
Sad 0.2%

AWS Rekognition

Age 26-36
Gender Male, 90.8%
Calm 68.7%
Sad 24.4%
Surprised 2.1%
Happy 1.5%
Confused 1.1%
Fear 0.9%
Angry 0.9%
Disgusted 0.4%

AWS Rekognition

Age 19-27
Gender Male, 95.2%
Happy 97.3%
Surprised 1.8%
Calm 0.3%
Fear 0.3%
Angry 0.1%
Disgusted 0.1%
Sad 0.1%
Confused 0.1%

AWS Rekognition

Age 35-43
Gender Male, 97.1%
Happy 78.9%
Calm 7.7%
Surprised 4.4%
Sad 4%
Disgusted 2.4%
Confused 1.4%
Angry 0.6%
Fear 0.6%

AWS Rekognition

Age 23-33
Gender Male, 99.6%
Calm 99.8%
Sad 0.1%
Surprised 0%
Confused 0%
Happy 0%
Disgusted 0%
Angry 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 98.9%
Painting 78.1%

Captions

Microsoft

a group of people posing for a photo 88.4%
a group of people posing for the camera 88.3%
a group of people posing for a picture 88.2%

Text analysis

Amazon

A2
...