Human Generated Data

Title

Untitled (bride and bridesmaids, Brookline, Massachusetts)

Date

1946, printed later

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.957

Machine Generated Data

Tags

The value beside each tag is the model's confidence score on a 0-100 scale.

Amazon
created on 2021-12-14

Clothing 99.5
Apparel 99.5
Person 98.8
Human 98.8
Person 98.3
Plant 97.6
Flower Bouquet 95.6
Flower 95.6
Flower Arrangement 95.6
Blossom 95.6
Person 95.2
Person 92
Person 87.1
Wedding 83.1
Gown 82.7
Fashion 82.7
Robe 81.7
Person 78.5
Person 78.3
Female 67.6
Wedding Gown 66.3
Photo 60.6
Photography 60.6
Painting 60.4
Art 60.4
Face 59.2
Portrait 59.2
Bride 56
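
These labels match the shape of output from Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags could be regenerated is below; the file name "photo.jpg" and the AWS region are placeholders, not values taken from this record.

# Sketch: generate label tags with Amazon Rekognition DetectLabels.
# Assumes AWS credentials are already configured; "photo.jpg" is a
# placeholder for the catalogued image.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=50,
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")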

Clarifai
created on 2023-10-15

people 99.9
group 98.9
woman 98.8
wedding 98.5
adult 97.8
family 97.7
two 96.9
portrait 96.6
wear 95.3
offspring 95.2
facial expression 95.2
three 95.1
four 94.7
retro 94.5
man 94
bride 92
monochrome 91.8
actress 91.3
sibling 89.9
child 86.6
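
The concepts above are consistent with Clarifai's general image-recognition model. A rough sketch against Clarifai's v2 REST API follows; the personal access token, image URL, and the exact model and app scoping are assumptions and may differ by account setup.

# Sketch: request concept tags from Clarifai's v2 REST API.
# "YOUR_PAT" and the image URL are placeholders, not values from this record.
import requests

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_PAT"},
    json={
        "user_app_id": {"user_id": "clarifai", "app_id": "main"},
        "inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}],
    },
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")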

Imagga
created on 2021-12-14

nurse 35.7
brass 21.7
man 21.5
people 20.6
male 17.2
adult 17.1
person 17.1
wind instrument 16.7
dress 15.3
musical instrument 13.6
happy 13.2
clothing 12.5
fashion 12.1
home 12
happiness 11.7
bride 11.5
interior 11.5
couple 11.3
worker 11.2
men 11.2
historic 11
family 10.7
bouquet 10.5
old 10.4
portrait 10.3
decoration 10.3
smile 10
window 9.3
elegance 9.2
wedding 9.2
professional 9.1
black 9
flowers 8.7
standing 8.7
smiling 8.7
love 8.7
house 8.4
room 8.2
mother 8
building 7.9
holiday 7.9
work 7.8
art 7.8
luxury 7.7
clothes 7.5
tourism 7.4
tradition 7.4
retro 7.4
business 7.3
child 7.1
hair 7.1
face 7.1
businessman 7.1
architecture 7
indoors 7
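
The tags above resemble output from Imagga's tagging endpoint. A hedged sketch is below; the API key, secret, and image URL are placeholders.

# Sketch: fetch tags from Imagga's /v2/tags endpoint (HTTP basic auth).
# Key, secret, and the image URL are placeholders, not values from this record.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")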

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

wall 98.5
wedding dress 98.3
text 97.3
bride 97.1
person 92.4
dress 91.5
woman 90.2
clothing 90.2
smile 90.2
human face 82.3
posing 75.2
black 74.7
wedding 70.9
white 62.9
old 48.1
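
The Microsoft tags are consistent with the Azure Computer Vision image-analysis service. A minimal sketch against the v3.2 REST endpoint follows; the resource endpoint, subscription key, and image URL are placeholders.

# Sketch: request image tags from Azure Computer Vision (Image Analysis v3.2).
# Endpoint, key, and image URL are placeholders, not values from this record.
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")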

Color Analysis

Face analysis

AWS Rekognition

Age 29-45
Gender Female, 99.6%
Happy 96%
Surprised 0.9%
Angry 0.9%
Fear 0.7%
Disgusted 0.6%
Confused 0.4%
Sad 0.3%
Calm 0.2%

AWS Rekognition

Age 21-33
Gender Female, 99%
Happy 84.7%
Calm 12.1%
Disgusted 1.8%
Confused 0.5%
Surprised 0.3%
Sad 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 22-34
Gender Female, 97.9%
Happy 99.3%
Surprised 0.4%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%
Confused 0%
Sad 0%
Calm 0%

AWS Rekognition

Age 16-28
Gender Female, 98.2%
Calm 85.1%
Sad 10%
Happy 2.6%
Angry 1%
Disgusted 0.3%
Confused 0.3%
Surprised 0.3%
Fear 0.3%

AWS Rekognition

Age 21-33
Gender Female, 99.1%
Calm 75.1%
Sad 20.7%
Happy 1.5%
Angry 0.9%
Surprised 0.6%
Disgusted 0.5%
Confused 0.5%
Fear 0.3%

AWS Rekognition

Age 22-34
Gender Male, 97.6%
Calm 99.1%
Sad 0.4%
Angry 0.3%
Surprised 0%
Happy 0%
Confused 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 23-35
Gender Female, 92.1%
Angry 72.2%
Happy 17%
Disgusted 9%
Calm 0.8%
Fear 0.4%
Surprised 0.2%
Confused 0.2%
Sad 0.2%

AWS Rekognition

Age 31-47
Gender Female, 94.1%
Sad 62.2%
Fear 18.1%
Happy 7.6%
Calm 5.9%
Angry 4.4%
Surprised 1.1%
Confused 0.5%
Disgusted 0.2%

AWS Rekognition

Age 49-67
Gender Female, 62%
Calm 43.4%
Surprised 25.9%
Happy 22%
Sad 3.3%
Angry 2.6%
Fear 1.1%
Confused 1%
Disgusted 0.6%
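
The per-face age ranges, gender estimates, and emotion percentages above follow the shape of Amazon Rekognition's DetectFaces response. A minimal sketch, again with a placeholder file name:

# Sketch: per-face age, gender, and emotion estimates with Amazon Rekognition
# DetectFaces. "photo.jpg" is a placeholder for the catalogued image.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")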

Microsoft Cognitive Services

Age 37
Gender Female

Microsoft Cognitive Services

Age 33
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
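
The Google Vision entries report per-face likelihood ratings rather than percentages. A minimal sketch using the google-cloud-vision client library, assuming application-default credentials and a placeholder file name:

# Sketch: per-face likelihood ratings with Google Cloud Vision face detection.
# "photo.jpg" is a placeholder for the catalogued image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Enum names such as VERY_UNLIKELY correspond to the ratings shown above.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)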

Feature analysis

Amazon

Person 98.8%
Painting 60.4%

Categories

Imagga

paintings art 98.8%

Text analysis

Amazon

70
69%
P81
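
Strings like these are the kind of output Amazon Rekognition's DetectText operation returns for text found in an image. A minimal sketch with a placeholder file name:

# Sketch: detect text in the image with Amazon Rekognition DetectText.
# "photo.jpg" is a placeholder for the catalogued image.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.0f}%")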

Google

P81
P81