Human Generated Data

Title

Untitled (studio portrait of two women wearing matching dresses holding cards and standing boy in U.S. Navy sailor's suit)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3742

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Clothing 99.8
Apparel 99.8
Furniture 99.6
Person 99.4
Human 99.4
Shoe 99.1
Footwear 99.1
Person 99
Person 99
Face 94.6
Shorts 89.9
Smile 86.5
Portrait 76.6
Photography 76.6
Photo 76.6
Outdoors 68.8
Female 68.1
Hat 63
Pants 62.8
People 62.6
Chair 60.9
Shirt 60.2
Sailor Suit 59.4
Child 59.1
Kid 59.1
Table 59
Glasses 57.3
Accessory 57.3
Accessories 57.3
Man 57.3
Door 56.1

Imagga
created on 2022-02-05

kin 32.8
musical instrument 22
man 18.8
people 18.4
statue 17.1
adult 16.9
concertina 16.5
male 15.7
religion 14.3
person 14.3
portrait 14.2
monument 14
couple 13.9
old 13.9
wind instrument 13.9
sculpture 13.8
free-reed instrument 13.2
religious 13.1
dress 12.6
art 12.5
percussion instrument 12.5
two 11
architecture 10.9
pretty 10.5
men 10.3
culture 10.2
black 10.2
face 9.9
history 9.8
catholic 9.7
costume 9.5
historical 9.4
happy 9.4
traditional 9.1
vintage 9.1
lady 8.9
bride 8.6
sitting 8.6
device 8.6
travel 8.4
church 8.3
city 8.3
holding 8.2
tourism 8.2
romantic 8
building 8
smiling 7.9
lifestyle 7.9
women 7.9
love 7.9
color 7.8
ancient 7.8
attractive 7.7
faith 7.6
stone 7.6
outdoors 7.5
park 7.4
water 7.3
detail 7.2
sexy 7.2
marimba 7.2
interior 7.1
summer 7.1
day 7.1
marble 7
happiness 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

wall 96.5
clothing 90.7
person 90.2
text 89.5
standing 77.7
dress 68.2
old 58
posing 38.5

Face analysis

Amazon

Google

AWS Rekognition

Age 45-53
Gender Male, 77.3%
Calm 90.4%
Disgusted 2.9%
Surprised 2.1%
Angry 1.7%
Sad 1.6%
Happy 0.6%
Confused 0.4%
Fear 0.3%

AWS Rekognition

Age 45-53
Gender Male, 99.3%
Calm 91.1%
Surprised 3.5%
Disgusted 2.5%
Confused 1.3%
Angry 0.4%
Sad 0.4%
Happy 0.4%
Fear 0.3%

AWS Rekognition

Age 26-36
Gender Male, 99.8%
Calm 98.8%
Disgusted 0.5%
Happy 0.3%
Surprised 0.1%
Confused 0.1%
Sad 0.1%
Angry 0.1%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Shoe 99.1%
Chair 60.9%

Captions

Microsoft

an old photo of a person 87.5%
old photo of a person 84.9%
a person posing for a photo 79.2%

Text analysis

Amazon

ENNAY