Human Generated Data

Title

Untitled (outdoor portrait of three formally dressed couples with hats and bonnets sitting on the ground)

Date

c. 1905

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3906

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.5
Human 99.5
Person 99.3
Person 99.2
Person 99.1
Apparel 98.7
Clothing 98.7
Person 98
Person 97.9
Dress 85.8
Female 81.4
Meal 80.9
Food 80.9
Art 74.5
Face 73
People 72.3
Outdoors 71.3
Smile 69.2
Leisure Activities 68.3
Fashion 67
Robe 67
Gown 67
Evening Dress 67
Photo 66.2
Photography 66.2
Portrait 65.6
Woman 63.6
Person 63.3
Performer 59.5
Plant 59.3
Grass 59.3
Nature 58.8
Vacation 58.2
Costume 56.9
Drawing 56.8
Sitting 56.4
Picnic 55.7

Clarifai
created on 2019-06-01

people 100
group 99.8
adult 98.5
veil 98.5
group together 97.6
several 97.6
many 97.4
wear 97
man 96.7
child 95.1
woman 93.7
five 92.1
four 91.5
lid 87.4
leader 86.9
recreation 85.6
three 84.9
art 84.2
administration 83.9
portrait 83.7

Imagga
created on 2019-06-01

groom 67.7
kin 62.8
couple 24.4
man 22.8
people 22.3
bride 21.2
love 19.7
wedding 19.3
dress 19
two 17.8
male 17.7
married 17.2
person 17
happiness 16.4
outdoor 15.3
outdoors 15
adult 14.9
men 14.6
summer 12.2
water 12
old 11.8
happy 11.3
park 10.7
romantic 10.7
family 10.7
marriage 10.4
wife 10.4
world 10.4
bouquet 10.4
sky 10.2
beach 10.1
husband 9.8
together 9.6
women 9.5
attractive 9.1
portrait 9.1
romance 8.9
smiling 8.7
outside 8.6
sport 8.5
life 8.3
silhouette 8.3
aged 8.1
cheerful 8.1
celebration 8
grass 7.9
day 7.8
gown 7.8
face 7.8
human 7.5
traditional 7.5
art 7.3
sunset 7.2
history 7.2
sand 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

text 96
person 93.9
clothing 92.8
wedding dress 84.9
bride 79.6
man 79.3
woman 78.8
black and white 68.3
smile 62.2
dress 53.4

Face analysis

Amazon

AWS Rekognition

Age 49-69
Gender Male, 54.5%
Angry 45.3%
Happy 45.3%
Sad 46.1%
Disgusted 45.3%
Confused 45.3%
Calm 52.4%
Surprised 45.3%

AWS Rekognition

Age 38-59
Gender Male, 74%
Angry 2.4%
Happy 2.8%
Sad 2.3%
Disgusted 4.6%
Confused 2.1%
Calm 83.1%
Surprised 2.7%

AWS Rekognition

Age 23-38
Gender Female, 50%
Confused 45.3%
Happy 47.1%
Calm 47.8%
Disgusted 45.4%
Sad 48.3%
Angry 45.4%
Surprised 45.6%

AWS Rekognition

Age 48-68
Gender Male, 73.6%
Calm 80.8%
Surprised 2.4%
Disgusted 1.1%
Happy 9.7%
Sad 3.4%
Confused 1.3%
Angry 1.2%

AWS Rekognition

Age 26-43
Gender Female, 53.2%
Calm 49.6%
Surprised 45.4%
Disgusted 45.2%
Happy 45.8%
Sad 48.1%
Confused 45.3%
Angry 45.5%

AWS Rekognition

Age 35-52
Gender Male, 80.3%
Confused 3%
Surprised 4%
Happy 7.8%
Angry 5.5%
Sad 12.7%
Disgusted 1.9%
Calm 65%
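Each AWS Rekognition block above reports a per-emotion confidence for one detected face. A minimal Python sketch of picking the dominant emotion from such a block (the helper name is hypothetical; the scores are copied from the last face block above):

```python
# Hypothetical helper: pick the dominant emotion from a Rekognition-style
# mapping of emotion name -> confidence (percent).
def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

# Scores from the sixth face block above (Age 35-52, Gender Male, 80.3%).
face = {
    "Confused": 3.0, "Surprised": 4.0, "Happy": 7.8,
    "Angry": 5.5, "Sad": 12.7, "Disgusted": 1.9, "Calm": 65.0,
}

print(dominant_emotion(face))  # -> ('Calm', 65.0)
```

Note that in the older-format blocks (where every emotion hovers near 45%), the scores are not normalized to sum to 100, so only the relative ordering is meaningful.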

Feature analysis

Amazon

Person 99.5%
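The machine-generated sections above are flat runs of "label confidence" lines. A minimal sketch (the parsing logic is an assumption for working with this scraped text, not part of the original record) of turning such lines into structured pairs and filtering by a confidence threshold:

```python
def parse_tags(text, threshold=0.0):
    """Parse 'label confidence' lines (e.g. 'Person 99.5') into
    (label, confidence) pairs, keeping those at or above threshold.
    Labels may contain spaces ('Evening Dress 67'), so we split on
    the last space only."""
    tags = []
    for line in text.strip().splitlines():
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return [(label, s) for label, s in tags if s >= threshold]

sample = """Person 99.5
Evening Dress 67
Picnic 55.7"""
print(parse_tags(sample, threshold=60))
# -> [('Person', 99.5), ('Evening Dress', 67.0)]
```

This makes it easy to compare services, e.g. keeping only high-confidence labels from the Amazon list against the Clarifai list.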