Human Generated Data

Title

Untitled (photograph of two men wrestling with seventeen men watching)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3622

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.9
Person 99.9
Person 97.7
Apparel 97.4
Clothing 97.4
Person 97.1
Person 96.5
Person 96.4
Person 96.3
Person 96
Person 94.5
Person 94.2
Person 94
Person 93.6
Person 92.3
Person 90.8
People 87.1
Robe 79.9
Fashion 79.9
Gown 76.5
Wedding 75.5
Face 68
Art 67.5
Photography 65.9
Photo 65.9
Portrait 64.4
Wedding Gown 64.2
Person 62.8
Shorts 62.3
Crowd 59.3
Drawing 58.6
Female 58
Plant 57.8
Coat 55.2
Suit 55.2
Overcoat 55.2

Clarifai
created on 2019-06-01

people 99.9
group 99.3
many 98.4
group together 98.2
adult 96.7
man 95
several 93.4
wear 93.2
child 90.6
veil 90.2
wedding 88
ceremony 87.8
woman 87.2
leader 81.9
outfit 80.4
dancing 79
war 77.4
military 75
administration 73.2
crowd 72.4

Imagga
created on 2019-06-01

bride 21.9
people 21.7
groom 19.8
man 19.5
beach 18.8
wedding 18.4
adult 18.2
dress 18.1
sea 18
kin 17.3
outdoors 17.2
sand 15.9
ice 14.3
person 14.2
summer 14.1
water 14
couple 13.9
love 13.4
walking 13.3
snow 13
group 12.1
men 12
winter 11.9
women 11.9
outdoor 11.5
two 11
portrait 11
happiness 11
coast 10.8
holiday 10.7
life 10.7
male 10.6
clothing 10.4
marriage 10.4
celebration 10.4
sky 10.2
day 10.2
ocean 10
married 9.6
happy 9.4
white 8.8
art 8.7
sunny 8.6
cold 8.6
black 8.4
traditional 8.3
fashion 8.3
vacation 8.2
picket fence 8.1
smiling 8
together 7.9
attractive 7.7
negative 7.6
sport 7.6
human 7.5
bag 7.4
plastic bag 7.3
religion 7.2
romantic 7.1
mountain 7.1
face 7.1
travel 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

outdoor 94.3
person 91.9
clothing 89.3
man 75.4
old 74.8
group 64.6
posing 40
clothes 22.5

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Female, 52.5%
Angry 45.7%
Happy 45.5%
Confused 47.6%
Calm 47.8%
Disgusted 45.1%
Sad 47.4%
Surprised 45.8%

AWS Rekognition

Age 26-43
Gender Male, 50.8%
Happy 45.7%
Surprised 45.3%
Angry 45.4%
Confused 45.3%
Calm 47.3%
Sad 47.7%
Disgusted 48.3%

AWS Rekognition

Age 35-52
Gender Female, 54.2%
Disgusted 45.9%
Confused 45.4%
Sad 45.7%
Happy 48.6%
Angry 45.7%
Surprised 45.6%
Calm 48.1%

AWS Rekognition

Age 26-43
Gender Female, 54.6%
Angry 45.1%
Happy 45.1%
Calm 54%
Surprised 45.1%
Sad 45.3%
Confused 45.2%
Disgusted 45.2%

AWS Rekognition

Age 26-43
Gender Female, 54.3%
Happy 50.4%
Confused 45.7%
Angry 46%
Disgusted 46.1%
Surprised 45.6%
Sad 45.7%
Calm 45.5%

AWS Rekognition

Age 11-18
Gender Female, 54.7%
Angry 45.2%
Happy 45.1%
Sad 47.3%
Disgusted 45.3%
Confused 45.4%
Calm 51.2%
Surprised 45.4%

AWS Rekognition

Age 16-27
Gender Female, 53.7%
Angry 45.2%
Surprised 45.6%
Calm 46.2%
Sad 45.6%
Confused 45.4%
Disgusted 51.9%
Happy 45.2%

AWS Rekognition

Age 17-27
Gender Female, 54.7%
Disgusted 45.5%
Angry 46.1%
Confused 46.4%
Sad 48%
Happy 45.2%
Calm 47.6%
Surprised 46.3%

Feature analysis

Amazon

Person 99.9%

Captions

Microsoft

a group of people in an old photo of a man 51.8%
a group of people posing for a photo 51.7%
an old photo of a group of people posing for the camera 51.6%