Human Generated Data

Title

Untitled (wedding party outside church)

Date

c. 1955

People

Artist: Clement McLarty, American, active 1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19719

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 100
Apparel 100
Dress 98.7
Human 98.2
Person 98.1
Person 97.7
Person 97.2
Robe 96.2
Fashion 96.2
Female 95.2
Person 95.1
Gown 94.5
Wedding 94
Street 93.7
City 93.7
Urban 93.7
Road 93.7
Building 93.7
Town 93.7
Person 93.4
Bridegroom 93.1
Person 90.6
Car 87.8
Transportation 87.8
Vehicle 87.8
Automobile 87.8
Person 87.7
Woman 85.2
Bride 84.7
Wedding Gown 84.7
Suit 84
Overcoat 84
Coat 84
Outdoors 83.9
Nature 78.9
Chair 76.3
Furniture 76.3
Face 72.7
Wheel 68.9
Machine 68.9
People 66
Photography 65.8
Photo 65.8
Portrait 63.6
Pedestrian 63.1
Path 61.1
Girl 60.8
Person 60.4
Alley 59.9
Alleyway 59.9
Person 58.5
Plant 57
Shorts 56.8
Tuxedo 56.5
Shoe 55.8
Footwear 55.8
Person 54.4

Clarifai
created on 2023-10-22

people 99.9
wedding 99.2
bride 98.8
adult 98.2
woman 97.9
wear 97.5
group 97.1
veil 96.9
dress 95.2
group together 95.2
actress 95
man 94.7
street 94
many 93.4
monochrome 90.8
groom 90.3
ceremony 88.7
dancing 88.7
actor 86.6
dancer 85.5

Imagga
created on 2022-03-05

people 26.8
man 23.5
groom 21.5
world 18.4
male 17.1
couple 16.5
person 16.2
love 15.8
city 15
travel 14.8
bride 14.4
adult 13.7
dress 13.5
building 13
wedding 12.9
spectator 12.8
business 12.8
portrait 12.3
together 12.3
life 12
happy 11.9
architecture 11.7
family 11.6
tourism 11.5
group 11.3
two 11
silhouette 10.8
urban 10.5
marriage 10.4
men 10.3
women 10.3
happiness 10.2
vacation 9.8
romantic 9.8
old 9.8
walking 9.5
sitting 9.4
smiling 9.4
black 9
businessman 8.8
looking 8.8
water 8.7
married 8.6
walk 8.6
tourist 8.4
summer 8.4
sky 8.3
park 8.2
history 8
lifestyle 7.9
sea 7.8
room 7.7
wife 7.6
professional 7.5
traditional 7.5
monument 7.5
outdoors 7.5
street 7.4
historic 7.3
landmark 7.2
transportation 7.2
religion 7.2
holiday 7.2

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 99.2
black and white 90.9
person 86.5
clothing 78.3
woman 67
people 59
dress 53.5
clothes 32.8

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 98.8%
Calm 78.7%
Happy 14.5%
Disgusted 1.9%
Sad 1.3%
Angry 1%
Confused 1%
Fear 0.9%
Surprised 0.7%

AWS Rekognition

Age 36-44
Gender Female, 92.3%
Sad 75.1%
Confused 16.6%
Calm 3.2%
Happy 3%
Angry 0.7%
Surprised 0.6%
Disgusted 0.4%
Fear 0.3%

AWS Rekognition

Age 24-34
Gender Female, 84.4%
Calm 50.1%
Confused 16.5%
Happy 10.4%
Angry 8%
Sad 6.6%
Fear 4.2%
Disgusted 2.3%
Surprised 2%

AWS Rekognition

Age 36-44
Gender Male, 82.5%
Calm 99.5%
Happy 0.3%
Disgusted 0%
Confused 0%
Surprised 0%
Sad 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 35-43
Gender Female, 80.3%
Happy 87.8%
Calm 11.5%
Sad 0.2%
Confused 0.2%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 39-47
Gender Female, 59.6%
Confused 39.5%
Calm 24.1%
Sad 21.6%
Happy 8.3%
Surprised 2.4%
Disgusted 1.4%
Fear 1.4%
Angry 1.2%

AWS Rekognition

Age 23-31
Gender Female, 97.3%
Fear 97.9%
Calm 0.8%
Sad 0.4%
Surprised 0.3%
Happy 0.3%
Confused 0.2%
Disgusted 0.1%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 98.1%
Person 97.7%
Person 97.2%
Person 95.1%
Person 93.4%
Person 90.6%
Person 87.7%
Person 60.4%
Person 58.5%
Person 54.4%
Car 87.8%
Wheel 68.9%
Shoe 55.8%

Text analysis

Amazon

27
ATE
COVELIA
VAGOY
LER