Human Generated Data

Title

Untitled (wedding party outside with cake)

Date

1950

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19562

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.9
Apparel 99.9
Human 99.3
Person 99.3
Person 99
Person 99
Person 98.7
Person 98.7
Person 97.5
Person 97.2
Person 96.9
Person 96.8
Dress 95.4
Robe 92
Fashion 92
Gown 90.1
Wedding 87.9
Female 87.8
People 85.9
Home Decor 84.9
Plant 77
Wedding Gown 76.6
Suit 75.7
Coat 75.7
Overcoat 75.7
Woman 74.6
Bridegroom 74.2
Face 73.1
Grass 72.6
Chair 69.2
Furniture 69.2
Crowd 68.1
Outdoors 66.2
Photo 65.4
Photography 65.4
Bride 65.3
Portrait 64.2
Shelter 64.2
Countryside 64.2
Rural 64.2
Nature 64.2
Building 64.2
Linen 59.6
Person 48.5

Imagga
created on 2022-03-05

kin 71.2
people 29
couple 24.4
man 22.8
male 19.1
groom 18.8
happiness 17.2
men 16.3
family 16
person 15.3
marimba 15.2
outdoors 14.2
together 14
happy 13.8
two 13.5
women 13.4
bride 13.4
love 13.4
percussion instrument 13.3
adult 13.2
smiling 12.3
wedding 11.9
old 11.8
religion 11.6
home 11.2
summer 10.9
park 10.7
life 10.7
musical instrument 10.4
portrait 10.3
outdoor 9.9
spectator 9.6
walking 9.5
day 9.4
lifestyle 9.4
world 9.2
girls 9.1
tourism 9.1
group 8.9
boy 8.7
married 8.6
friends 8.4
travel 8.4
mother 8.2
dress 8.1
grass 7.9
flowers 7.8
youth 7.7
husband 7.6
garden 7.5
friendship 7.5
fun 7.5
leisure 7.5
holding 7.4
church 7.4
uniform 7.2
black 7.2
father 7.2
history 7.2
to 7.1

Google
created on 2022-03-05

Coat 89.9
Black 89.5
Black-and-white 86.5
Style 84
Window 83.7
Suit 81.8
Plant 80.9
Adaptation 79.3
Monochrome 77.6
Vintage clothing 76.9
Monochrome photography 76.8
Event 73.2
Building 71.1
Classic 69.4
History 66.8
Font 66.1
Tree 65.3
Photo caption 64.7
Stock photography 64.7
House 62.3

Microsoft
created on 2022-03-05

wedding dress 96.6
person 96.6
clothing 95.8
bride 94.4
outdoor 90.9
text 90
man 86.3
woman 85
black and white 83.5
funeral 81.5
standing 78.4
dress 69.6
wedding 68.6
group 55.9
clothes 20.3
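
The four tag lists above use different capitalization and vocabularies, so comparing them takes a normalization step. A minimal sketch, using a hand-copied subset of the confidence scores from this record (the dictionaries below are excerpts, not the full lists), finds tags reported by at least three of the four services:

```python
from collections import Counter

# Confidence scores (0-100) excerpted from the tag lists above.
amazon = {"Person": 99.3, "Wedding": 87.9, "Dress": 95.4, "Outdoors": 66.2, "Plant": 77.0}
imagga = {"people": 29.0, "wedding": 11.9, "dress": 8.1, "outdoors": 14.2}
google = {"Plant": 80.9, "Suit": 81.8, "Coat": 89.9}
microsoft = {"person": 96.6, "wedding": 68.6, "dress": 69.6, "outdoor": 90.9}

def normalize(tags):
    """Lower-case tag names so providers can be compared."""
    return {name.lower(): score for name, score in tags.items()}

providers = [normalize(t) for t in (amazon, imagga, google, microsoft)]

# Tags reported by at least three of the four services.
counts = Counter(tag for p in providers for tag in p)
consensus = sorted(tag for tag, n in counts.items() if n >= 3)
print(consensus)  # ['dress', 'wedding']
```

Note that simple lower-casing still misses near-duplicates such as "Outdoors" versus "outdoor"; stemming or a synonym map would be needed to merge those.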

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 55.9%
Calm 99.3%
Surprised 0.3%
Confused 0.1%
Fear 0.1%
Disgusted 0.1%
Sad 0.1%
Happy 0%
Angry 0%

AWS Rekognition

Age 38-46
Gender Female, 96.6%
Calm 94.6%
Happy 4.5%
Surprised 0.3%
Fear 0.2%
Sad 0.1%
Disgusted 0.1%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 36-44
Gender Male, 97.4%
Calm 92.3%
Surprised 3.8%
Happy 1.6%
Sad 1.2%
Confused 0.7%
Disgusted 0.2%
Fear 0.2%
Angry 0.1%

AWS Rekognition

Age 35-43
Gender Male, 94.9%
Calm 99%
Sad 0.4%
Confused 0.3%
Surprised 0.1%
Happy 0.1%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 38-46
Gender Male, 97.1%
Calm 55.8%
Happy 25.3%
Surprised 7.6%
Sad 3.9%
Disgusted 3%
Confused 2.7%
Angry 0.9%
Fear 0.9%

AWS Rekognition

Age 40-48
Gender Female, 81.1%
Calm 92.7%
Happy 5.8%
Surprised 0.7%
Fear 0.3%
Confused 0.2%
Disgusted 0.2%
Sad 0.1%
Angry 0%

AWS Rekognition

Age 31-41
Gender Male, 97.9%
Calm 82.9%
Surprised 13.1%
Sad 1.5%
Confused 0.8%
Happy 0.6%
Fear 0.4%
Disgusted 0.4%
Angry 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
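
Each AWS Rekognition block above is an emotion distribution for one detected face. A minimal sketch of how those tables might be summarized, with the top-scoring entries copied by hand from the seven blocks in this record:

```python
# Emotion scores (percent) for the seven faces detected by AWS
# Rekognition, excerpted from the face analysis tables above.
faces = [
    {"Calm": 99.3, "Surprised": 0.3, "Confused": 0.1},
    {"Calm": 94.6, "Happy": 4.5, "Surprised": 0.3},
    {"Calm": 92.3, "Surprised": 3.8, "Happy": 1.6},
    {"Calm": 99.0, "Sad": 0.4, "Confused": 0.3},
    {"Calm": 55.8, "Happy": 25.3, "Surprised": 7.6},
    {"Calm": 92.7, "Happy": 5.8, "Surprised": 0.7},
    {"Calm": 82.9, "Surprised": 13.1, "Sad": 1.5},
]

# The dominant emotion per face: every face in this record reads as Calm.
dominant = [max(face, key=face.get) for face in faces]
print(dominant)
```

The fifth face (Calm 55.8%, Happy 25.3%) is the least decisive, which matches its more spread-out distribution in the table above.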

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people standing in front of a building 94.6%
a group of people standing in front of a store 87.2%
a group of people standing outside of a building 87.1%

Text analysis

Amazon

5
%689

Google

6
E
7 6 E 9
7
9