Human Generated Data

Title

Untitled (bride and wedding party in front of car)

Date

1946

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19493

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Apparel 99.9
Clothing 99.9
Person 99.2
Human 99.2
Person 98.6
Person 97.8
Person 96.9
Person 96.7
Person 95.6
Person 94.9
Robe 94.6
Fashion 94.6
Gown 92.7
Wedding 87.7
Wheel 86.4
Machine 86.4
Female 85.9
Person 83
Tie 80.6
Accessory 80.6
Accessories 80.6
Overcoat 77.4
Suit 77.4
Coat 77.4
Wedding Gown 77.1
Evening Dress 76.5
Transportation 74.6
Vehicle 74.6
Automobile 74.6
Car 74.6
Woman 72.7
Bride 68.5
Dress 67.2
Face 60.7
People 57.7
Crowd 55.7
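Each Amazon (AWS Rekognition) tag above is paired with a confidence score. A minimal sketch of filtering such output down to high-confidence labels, using a subset of the values listed in this record (in practice the pairs would come from Rekognition's DetectLabels API; here they are hard-coded so the logic is self-contained):

```python
# Label/confidence pairs copied from the Amazon tag list above (subset).
labels = [
    ("Apparel", 99.9), ("Clothing", 99.9), ("Person", 99.2), ("Human", 99.2),
    ("Robe", 94.6), ("Fashion", 94.6), ("Gown", 92.7), ("Wedding", 87.7),
    ("Wheel", 86.4), ("Car", 74.6), ("Bride", 68.5), ("Crowd", 55.7),
]

def high_confidence(labels, threshold=90.0):
    """Keep only labels at or above the given confidence threshold."""
    return [name for name, score in labels if score >= threshold]

print(high_confidence(labels))
# → ['Apparel', 'Clothing', 'Person', 'Human', 'Robe', 'Fashion', 'Gown']
```

The 90.0 threshold is an illustrative choice; the record itself lists every label the service returned, down to 55.7%.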

Clarifai
created on 2019-10-29

people 99.8
group together 99.1
group 98.2
adult 97.4
woman 96.3
man 96.2
monochrome 96
several 94.6
many 94.5
vehicle 93.7
wear 93.6
street 92.3
administration 89
wedding 88.4
four 87.6
leader 86.9
child 83.7
three 83.6
five 83.4
outfit 83.4

Imagga
created on 2019-10-29

world 46
people 20.1
man 19.5
person 16.8
kin 16.4
adult 15.6
groom 15.3
male 14.2
old 13.2
limousine 12.8
clothing 12.5
black 11.4
travel 11.3
car 11.1
mask 10.9
dress 10.8
city 10.8
history 10.7
scene 10.4
dark 10
tourism 9.9
religion 9.9
business 9.7
portrait 9.7
couple 9.6
men 9.4
happiness 9.4
monument 9.3
uniform 9.1
tourist 9.1
life 9
bride 8.6
architecture 8.6
walking 8.5
two 8.5
suit 8.2
group 8.1
art 7.9
love 7.9
faith 7.7
human 7.5
traditional 7.5
silhouette 7.4
church 7.4
tradition 7.4
street 7.4
wedding 7.4
girls 7.3
protection 7.3
danger 7.3
women 7.1

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

text 99.7
outdoor 97.2
clothing 93.5
wedding dress 93.5
person 89.5
bride 88
car 86
black and white 84.1
woman 79
vehicle 77.5
man 70.8
dress 68.9
land vehicle 57.5

Face analysis

Amazon

AWS Rekognition

Age 24-38
Gender Male, 53.5%
Disgusted 45.1%
Sad 46.2%
Fear 45.2%
Surprised 45.5%
Happy 49.7%
Calm 47.7%
Confused 45.5%
Angry 45.2%

AWS Rekognition

Age 43-61
Gender Male, 52.8%
Sad 50.3%
Confused 45.1%
Happy 45%
Calm 48.2%
Disgusted 45%
Surprised 45.1%
Fear 46.1%
Angry 45%

AWS Rekognition

Age 13-25
Gender Male, 54.5%
Fear 50.2%
Angry 45.4%
Disgusted 45.1%
Calm 46.1%
Happy 46%
Sad 45.8%
Surprised 45.8%
Confused 45.6%

AWS Rekognition

Age 44-62
Gender Male, 53.6%
Confused 45.2%
Surprised 45%
Sad 54.3%
Calm 45.3%
Disgusted 45%
Happy 45%
Fear 45.2%
Angry 45.1%

AWS Rekognition

Age 6-16
Gender Male, 54.1%
Disgusted 46.1%
Confused 45.4%
Fear 45.8%
Happy 45.2%
Sad 51.7%
Angry 45.3%
Surprised 45.1%
Calm 45.5%

AWS Rekognition

Age 19-31
Gender Male, 52.8%
Sad 54.9%
Surprised 45%
Happy 45%
Confused 45%
Fear 45%
Disgusted 45%
Calm 45.1%
Angry 45%

AWS Rekognition

Age 15-27
Gender Female, 51.4%
Fear 45.4%
Happy 45%
Surprised 45.1%
Angry 52.4%
Confused 45.3%
Disgusted 45.1%
Calm 45.5%
Sad 46.2%
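Each AWS Rekognition face entry above reports a confidence for every emotion; the predicted emotion is simply the highest-scoring one. A minimal sketch using the first face's values (where Happy, at 49.7%, is the maximum):

```python
# Emotion/confidence pairs from the first AWS Rekognition face entry above.
emotions = {
    "Disgusted": 45.1, "Sad": 46.2, "Fear": 45.2, "Surprised": 45.5,
    "Happy": 49.7, "Calm": 47.7, "Confused": 45.5, "Angry": 45.2,
}

def dominant_emotion(scores):
    """Return the emotion with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # → Happy
```

Note how close the scores are (all within about 5 points of each other), which is typical of the low-certainty emotion estimates throughout this record.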

Feature analysis

Amazon

Person 99.2%
Wheel 86.4%
Tie 80.6%
Car 74.6%

Captions

Microsoft

a group of people standing in front of a building 91.9%
a group of people posing for a photo 87%
a group of men standing in front of a building 86.9%
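Microsoft's captioning returns several candidate captions with confidences; choosing the displayed caption amounts to taking the highest-scoring one (91.9% here). A minimal sketch over the three captions listed above:

```python
# Caption/confidence pairs copied from the Microsoft captions above.
captions = [
    ("a group of people standing in front of a building", 91.9),
    ("a group of people posing for a photo", 87.0),
    ("a group of men standing in front of a building", 86.9),
]

# The best caption is simply the highest-confidence candidate.
best = max(captions, key=lambda c: c[1])
print(best[0])  # → a group of people standing in front of a building
```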