Human Generated Data

Title

Untitled (portrait of woman looking over man reading paper)

Date

c.1930

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4330

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 97.2
Clothing 95.9
Apparel 95.9
Person 94.3
Person 85.2
Female 84.8
Person 74
People 68.7
Worker 67.7
Girl 66.9
Photography 64.9
Photo 64.9
Woman 64.5
Face 63
Drawing 60
Art 60
Hairdresser 59.7
Dress 59.5
Fashion 57
Overcoat 56.2
Suit 56.2
Coat 56.2
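Label lists like the Amazon block above are the kind of output returned by AWS Rekognition's DetectLabels API (each label carries a `Name` and a `Confidence` percentage). A minimal sketch of turning such a response into the "Name Confidence" lines shown here; the live boto3 call is commented out since it needs AWS credentials, and the mock response below is an assumption shaped like the listing:

```python
def format_labels(response):
    """Flatten a Rekognition DetectLabels-style response into
    'Name Confidence' lines, rounded to one decimal as in the listing."""
    return [
        f"{label['Name']} {round(label['Confidence'], 1):g}"
        for label in response["Labels"]
    ]

# The real call requires AWS credentials; sketch only:
# import boto3
# client = boto3.client("rekognition")
# with open("photo.jpg", "rb") as f:  # hypothetical image path
#     response = client.detect_labels(Image={"Bytes": f.read()})

# Mock response shaped like the Rekognition output above:
response = {"Labels": [
    {"Name": "Human", "Confidence": 97.23},
    {"Name": "Person", "Confidence": 74.0},
]}
print("\n".join(format_labels(response)))
```

The `:g` format drops trailing zeros, which matches how the listing prints "Person 74" but "Human 97.2".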

Clarifai
created on 2019-06-01

people 99.9
two 98.6
adult 97.8
man 95.8
veil 95.5
woman 95
wear 95
print 94.3
art 89.9
group 89.7
leader 89.2
administration 88.4
one 87.6
illustration 86.5
three 86.4
portrait 85.1
sit 83.5
furniture 83.2
wedding 81.2
retro 79.8
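The Clarifai predict API reports concepts with a `name` and a `value` between 0 and 1, so percentages like "people 99.9" above correspond to `value * 100`. A sketch of that conversion, using mock concepts as an assumption shaped like the listing:

```python
def format_concepts(concepts):
    """Convert Clarifai-style concepts (value in 0-1) to
    'name percent' lines as in the listing above."""
    return [f"{c['name']} {round(c['value'] * 100, 1):g}" for c in concepts]

# Mock concepts shaped like the Clarifai output above:
concepts = [{"name": "people", "value": 0.999}, {"name": "woman", "value": 0.95}]
print("\n".join(format_concepts(concepts)))
```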

Imagga
created on 2019-06-01

negative 27.6
film 23.3
sketch 17.9
drawing 17.8
dress 16.3
portrait 16.2
old 16
art 15.8
black 15.2
photographic paper 15
bride 14.4
people 13.9
clothing 13.1
book jacket 13
grunge 12.8
elegance 12.6
vintage 12.4
jacket 12.2
ancient 12.1
person 12
statue 11.8
man 11.4
body 11.2
representation 11
covering 11
posing 10.7
fashion 10.6
adult 10.5
attractive 10.5
window 10.3
wedding 10.1
model 10.1
photographic equipment 10
retro 9.8
bridal 9.7
face 9.2
pretty 9.1
human 9
lady 8.9
sculpture 8.7
antique 8.7
bouquet 8.5
gown 8.4
one 8.2
sensuality 8.2
symbol 8.1
sexy 8
design 7.9
flowers 7.8
sepia 7.8
decoration 7.8
party 7.7
wrapping 7.7
head 7.6
silhouette 7.4
church 7.4
alone 7.3
dirty 7.2
male 7.2
cute 7.2
religion 7.2
hair 7.1
love 7.1
architecture 7
look 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

sketch 97.2
drawing 95.9
text 92.9
cartoon 88.2
clothing 83.4
wedding dress 80.4
dress 75.1
painting 75
person 74
black and white 62.5
bride 52.4

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 51.3%
Angry 45.7%
Disgusted 45.3%
Confused 45.5%
Calm 47.2%
Sad 46.3%
Surprised 46.3%
Happy 48.7%
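Emotion scores like the Rekognition block above come from DetectFaces with `Attributes=['ALL']`, which returns a per-face `Emotions` list of type/confidence pairs. A sketch of picking the dominant emotion from a face record; the mock data is an assumption built from the values above (where Happy, at 48.7%, is the highest):

```python
def dominant_emotion(face_detail):
    """Return the highest-confidence emotion from a Rekognition
    DetectFaces FaceDetail record (Attributes=['ALL'])."""
    top = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

# Mock FaceDetail shaped like the face analysis above:
face = {"Emotions": [
    {"Type": "HAPPY", "Confidence": 48.7},
    {"Type": "CALM", "Confidence": 47.2},
    {"Type": "SAD", "Confidence": 46.3},
]}
print(dominant_emotion(face))  # → ('HAPPY', 48.7)
```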

Feature analysis

Amazon

Person 94.3%

Captions

Microsoft

a close up of text on a black background 47.5%
a close up of text on a white background 47.4%
close up of text on a black background 44.4%

Text analysis

Amazon

RAM
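A detection like "RAM" above is the kind of result returned by Rekognition's DetectText API, whose response lists `TextDetections` at both LINE and WORD granularity. A sketch of collecting the line-level detections; the mock response is an assumption shaped like that output:

```python
def detected_lines(response):
    """Collect LINE-level detections from a Rekognition
    DetectText-style response, skipping the duplicate WORD entries."""
    return [d["DetectedText"] for d in response["TextDetections"]
            if d["Type"] == "LINE"]

# Mock response shaped like the text analysis above
# (confidence values are placeholders):
response = {"TextDetections": [
    {"DetectedText": "RAM", "Type": "LINE", "Confidence": 90.0},
    {"DetectedText": "RAM", "Type": "WORD", "Confidence": 90.0},
]}
print(detected_lines(response))  # → ['RAM']
```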