Human Generated Data

Title

Untitled (groom removing bride's garter)

Date

1965

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19498

Machine Generated Data

Tags (label with confidence score, 0-100)

Amazon
created on 2019-10-29

Human 99.5
Person 99.5
Person 99.4
Person 99.3
Person 99.3
Person 98.1
Clothing 92.5
Apparel 92.5
Person 91.5
Person 90.7
Person 87.7
Person 87.2
Person 75.4
Costume 74.9
Footwear 72.8
Shoe 72.8
Face 72.2
Person 70.1
Dress 70.1
Shoe 69
Female 66.7
Accessory 65
Accessories 65
Sunglasses 65
Advertisement 63.9
Outdoors 63
Poster 62.7
Photo 60.5
Photography 60.5
Musical Instrument 60.1
Leisure Activities 60.1
Guitar 60.1
Clinic 59.6
Plant 59.3
Blossom 59.3
Flower 59.3
Nature 59.1
Collage 57.8
Indoors 56.9
Room 56.9
Art 56.1
Person 46
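
These labels and scores match the output format of Amazon Rekognition's label detection. A minimal sketch of how such tags can be reproduced with boto3, assuming AWS credentials are configured; the bucket and object names are placeholders, not the museum's actual storage:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
        MinConfidence=45,  # lowest score in the list above is Person at 46
    )
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')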

Clarifai
created on 2019-10-29

people 99.7
many 99.4
crowd 99.3
group 98.4
group together 97
wear 94.4
man 93.5
adult 92.3
audience 91
ceremony 88.4
woman 86.8
leader 86.3
music 84.6
child 84.2
war 83.3
military 79.4
veil 76.8
administration 75.3
dancing 71.2
spectator 68.3
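
Clarifai's scores come from its general prediction model, which returns values between 0 and 1 (shown here scaled to 0-100). A sketch using the 2.x Python client that was current in 2019 (since superseded by Clarifai's gRPC client); the API key and image URL are placeholders:

    from clarifai.rest import ClarifaiApp

    app = ClarifaiApp(api_key="YOUR_API_KEY")
    model = app.public_models.general_model
    response = model.predict_by_url("https://example.org/photo.jpg")
    for concept in response["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))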

Imagga
created on 2019-10-29

clothing 28
people 22.3
ice 20.5
brassiere 19.7
garment 19
person 18.2
plastic bag 17.8
bride 16.4
bag 16.4
crowd 16.3
coat 16.2
lab coat 15.8
woman's clothing 15.8
undergarment 15.8
man 15.4
men 13.7
group 13.7
covering 13.4
wedding 12.9
consumer goods 12.9
adult 12.3
container 12.3
women 11.9
scene 11.2
black 10.8
water 10.7
male 10.6
white 10.6
couple 10.4
dress 9.9
life 9.9
human 9.7
outdoors 9.7
hands 9.5
love 9.5
happiness 9.4
happy 9.4
snow 8.9
crystal 8.9
groom 8.7
work 8.6
outdoor 8.4
silhouette 8.3
team 8.1
cool 8
celebration 8
business 7.9
together 7.9
sea 7.8
rock 7.8
portrait 7.8
cold 7.7
party 7.7
motion 7.7
winter 7.7
sky 7.6
health 7.6
marriage 7.6
wife 7.6
walking 7.6
fun 7.5
ocean 7.5
landscape 7.4
equipment 7.4
teamwork 7.4
natural 7.4
day 7.1
medical 7.1
medicine 7
travel 7
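
Imagga exposes the same kind of tagging through its v2 REST API, authenticated with an API key and secret over HTTP Basic auth. A sketch with the requests library; the credentials and image URL are placeholders:

    import requests

    auth = ("acc_example_key", "example_secret")
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=auth,
    )
    for tag in resp.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))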

Google
created on 2019-10-29
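
No tags were captured for the Google run. For comparison, label detection with the Google Cloud Vision Python client looks like the sketch below; it assumes GOOGLE_APPLICATION_CREDENTIALS is set, and the image URL is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    image = vision.Image()
    image.source.image_uri = "https://example.org/photo.jpg"
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(label.description, round(label.score * 100, 1))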

Microsoft
created on 2019-10-29

text 91.3
person 88.8
clothing 87.4
man 78.7
black and white 75.3
footwear 70.5
crowd 0.7
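
These tags match Azure Computer Vision's Analyze Image operation, whose confidences are returned as 0-1 values (scaled to 0-100 here, with crowd at a genuinely low 0.7). A sketch against the current v3.2 REST endpoint; the endpoint, key, and image URL are placeholders:

    import requests

    endpoint = "https://example.cognitiveservices.azure.com"
    resp = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
        json={"url": "https://example.org/photo.jpg"},
    )
    for tag in resp.json()["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))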

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 50.7%
Confused 45.5%
Disgusted 45.2%
Calm 46.9%
Angry 45.1%
Happy 47.7%
Sad 45.6%
Surprised 48.6%
Fear 45.4%

AWS Rekognition

Age 17-29
Gender Male, 52%
Fear 45.2%
Confused 45.2%
Angry 45.3%
Happy 45.3%
Sad 48.8%
Calm 49.8%
Surprised 45.1%
Disgusted 45.2%

AWS Rekognition

Age 12-22
Gender Male, 53.9%
Happy 45%
Angry 45.1%
Surprised 45.1%
Sad 48.2%
Disgusted 45%
Confused 45.9%
Fear 45.1%
Calm 50.5%

AWS Rekognition

Age 30-46
Gender Male, 53.3%
Disgusted 45%
Calm 47.5%
Angry 45%
Sad 45.7%
Fear 45%
Surprised 45.5%
Confused 50.3%
Happy 45.9%

AWS Rekognition

Age 32-48
Gender Male, 51.7%
Sad 52.9%
Happy 45.1%
Angry 45.5%
Confused 45.1%
Disgusted 45%
Fear 45.1%
Calm 46.3%
Surprised 45%

AWS Rekognition

Age 34-50
Gender Female, 53.4%
Calm 46%
Sad 45.4%
Disgusted 45.1%
Confused 45.2%
Surprised 48.8%
Angry 45.2%
Fear 48.3%
Happy 46%

AWS Rekognition

Age 24-38
Gender Male, 50.5%
Calm 46.8%
Angry 45%
Confused 45.1%
Sad 45.1%
Happy 52.9%
Disgusted 45%
Fear 45%
Surprised 45%

AWS Rekognition

Age 42-60
Gender Male, 51.8%
Surprised 45.3%
Happy 45.3%
Fear 45.2%
Confused 45.6%
Sad 45.9%
Calm 51%
Angry 46.6%
Disgusted 45.2%
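
Each block above corresponds to one face in Rekognition's DetectFaces response, called with the full attribute set so that age range, gender, and per-emotion confidences are included. A sketch, again with placeholder bucket and object names:

    import boto3

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
        Attributes=["ALL"],  # required for AgeRange, Gender, and Emotions
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')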

Feature analysis

Amazon

Person 99.5%
Shoe 72.8%
Guitar 60.1%
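
Feature analysis lists the labels for which Rekognition also returned localized instances (bounding boxes). These can be read out of the same detect_labels response used in the tagging sketch above:

    # Reusing 'response' from the earlier detect_labels sketch.
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            print(label["Name"], round(instance["Confidence"], 1),
                  instance["BoundingBox"])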

Categories

Imagga

paintings art 99.8%
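
The category comes from Imagga's v2 categorization endpoint, which scores an image against a named categorizer. A sketch using Imagga's documented "personal_photos" categorizer as an example; which categorizer produced "paintings art" here is not recorded, and the credentials and image URL are placeholders:

    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/categories/personal_photos",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=("acc_example_key", "example_secret"),
    )
    for category in resp.json()["result"]["categories"]:
        print(category["name"]["en"], round(category["confidence"], 1))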

Text analysis

Amazon

Y133A
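
The detected string is consistent with Rekognition's DetectText operation, which returns line- and word-level detections; "Y133A" is plausibly a frame or proof number written on the negative. A sketch, with placeholder bucket and object names:

    import boto3

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
    )
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], round(detection["Confidence"], 1))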