Human Generated Data

Title

Untitled (photograph of three women seated around small table with small dog on table's top)

Date

c. 1910-1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3618

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.1
Human 99.1
Person 98.4
Person 94.8
Clothing 94.7
Apparel 94.7
Furniture 91.7
Chair 91.7
Art 70.8
Painting 70.8
People 70.6
Photo 62.4
Photography 62.4
Face 59.9
Female 59.6
Nurse 55.6
Performer 55.1
Person 54.4
Person 42.3

Clarifai
created on 2019-06-01

people 100
group 99.2
adult 98.4
wear 98.2
print 97.2
outfit 96.3
man 95.9
leader 94.5
two 94.3
woman 92.6
illustration 91.8
furniture 89.9
veil 89.5
administration 89.4
three 88.8
several 87.9
group together 87.8
many 86.4
engraving 86
art 85.3

Imagga
created on 2019-06-01

kin 33.5
statue 26.5
old 25.1
groom 22.6
religion 20.6
history 20.6
monument 20.5
architecture 19.5
sculpture 19.5
people 18.4
marble 18.4
church 17.6
person 17
art 16.8
ancient 16.4
catholic 16.1
bride 15.6
dress 15.4
god 15.3
world 14.9
portrait 14.2
love 14.2
mother 13.9
stone 13.6
tourism 13.2
wedding 12.9
building 12.8
cathedral 12.5
faith 12.4
historical 12.2
couple 12.2
antique 12.1
man 12.1
famous 12.1
detail 12.1
culture 12
historic 11.9
gown 11.7
city 11.6
vintage 11.6
adult 11.5
travel 11.3
face 10.7
male 10.6
religious 10.3
happiness 10.2
newspaper 10.2
two 10.2
tourist 10
landmark 9.9
lady 9.7
metropolitan 9.6
bouquet 9.4
senior 9.4
column 9.1
aged 9
fashion 9
angel 8.8
women 8.7
saint 8.7
spirituality 8.6
husband 8.5
memorial 8.3
creation 8.1
holy 7.7
head 7.6
closeup 7.4
symbol 7.4
clothing 7.3
product 7.3
home 7.2
indoors 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

text 99.1
sketch 98.1
drawing 96.6
person 91.4
clothing 90.9
old 88.3
painting 85.8
woman 78.9
black 68.8
dress 60.9
human face 52.2
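Each tag list above pairs a label with a confidence score, given as a percentage. Services report many low-confidence labels, so a consumer typically keeps only tags above a chosen cutoff. A minimal sketch (plain Python, not a service API call) using the Amazon tags above and an assumed 90% threshold:

```python
# Amazon tag list from above: (label, confidence-percent) pairs, truncated.
amazon_tags = [
    ("Person", 99.1), ("Human", 99.1), ("Person", 98.4), ("Person", 94.8),
    ("Clothing", 94.7), ("Apparel", 94.7), ("Furniture", 91.7), ("Chair", 91.7),
    ("Art", 70.8), ("Painting", 70.8), ("People", 70.6), ("Photo", 62.4),
]

def filter_tags(tags, threshold):
    """Keep tags at or above the confidence threshold, dropping repeated labels."""
    seen = set()
    kept = []
    for label, score in tags:
        if score >= threshold and label not in seen:
            seen.add(label)
            kept.append((label, score))
    return kept

print(filter_tags(amazon_tags, 90.0))
# [('Person', 99.1), ('Human', 99.1), ('Clothing', 94.7),
#  ('Apparel', 94.7), ('Furniture', 91.7), ('Chair', 91.7)]
```

The repeated "Person" entries are per-detection scores for different people in the frame; deduplicating keeps only the highest-confidence instance of each label.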

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 93%
Sad 37.5%
Surprised 5.2%
Disgusted 7.2%
Angry 9.7%
Calm 31.2%
Happy 4.4%
Confused 4.7%

AWS Rekognition

Age 26-43
Gender Female, 50.1%
Disgusted 45.7%
Confused 45.2%
Surprised 46.1%
Calm 47.7%
Happy 46%
Angry 45.8%
Sad 48.4%

AWS Rekognition

Age 26-43
Gender Male, 54.9%
Happy 2%
Disgusted 1.4%
Sad 11.6%
Surprised 7.1%
Angry 3.2%
Calm 72.7%
Confused 1.9%

AWS Rekognition

Age 11-18
Gender Female, 54.6%
Sad 47.1%
Confused 46.2%
Disgusted 45.8%
Surprised 46.7%
Angry 46%
Happy 45.9%
Calm 47.4%
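Each face record above lists a confidence score per emotion; the reported mood is simply the highest-scoring entry. A minimal sketch (plain Python on the first face record above, not an AWS Rekognition call) of picking the dominant emotion:

```python
# Emotion confidence scores (percent) for the first face record above.
face_1 = {
    "Sad": 37.5, "Surprised": 5.2, "Disgusted": 7.2, "Angry": 9.7,
    "Calm": 31.2, "Happy": 4.4, "Confused": 4.7,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(face_1))  # ('Sad', 37.5)
```

Note that close runners-up (here, Calm at 31.2%) carry nearly as much weight as the winner, so a single dominant label should be read with caution.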

Feature analysis

Amazon

Person 99.1%
Chair 91.7%
Painting 70.8%

Captions

Microsoft

a vintage photo of a person 91%
an old photo of a person 90.9%
a vintage photo of a person holding a book 58.1%