Human Generated Data

Title

Photo Album

Date

c. 1857 - c. 1874

Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from Widener Library, Harvard University, 1978.484.76

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Clothing 97.2%
Apparel 97.2%
Person 96.8%
Human 96.8%
Sitting 92.5%
Painting 69.2%
Art 69.2%
Face 68.2%
Portrait 66.7%
Photography 66.7%
Photo 66.7%
Furniture 64.8%
Robe 62.3%
Fashion 62.3%
Gown 57.6%
Musician 56%
Musical Instrument 56%
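
The Amazon tags above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of such a call with boto3 follows; the image file name and the MaxLabels/MinConfidence settings are illustrative assumptions, not values taken from this record.

# Minimal sketch: fetch image labels with Amazon Rekognition via boto3.
# The file name and thresholds below are illustrative assumptions.
import boto3

client = boto3.client("rekognition")

with open("photo_album_page.jpg", "rb") as f:  # hypothetical local image
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,      # cap on returned labels
    MinConfidence=50,  # drop labels below 50% confidence
)

# Print "Name Confidence" pairs, mirroring the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}%")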

Imagga
created on 2022-01-09

lampshade 23.1%
shade 19%
black 18.7%
container 14.4%
protective covering 14.1%
old 12.5%
covering 12.4%
tray 12%
musical instrument 10.5%
man 10.1%
people 10%
art 9.7%
receptacle 9.7%
dark 9.2%
silhouette 9.1%
equipment 9.1%
adult 9%
symbol 8.7%
antique 8.6%
male 8.5%
vintage 8.4%
call 8.3%
happy 8.1%
computer 8.1%
sexy 8%
light 8%
body 8%
design 8%
face 7.8%
portrait 7.8%
bag 7.8%
attractive 7.7%
retro 7.4%
device 7.3%
religion 7.2%
night 7.1%
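
The Imagga tags likewise come from a REST tagging endpoint (Imagga's v2 tags API). A hedged sketch using the requests library follows; the credentials and image URL are placeholders, and the response shape shown is assumed from Imagga's documented format.

# Hedged sketch: request tags from Imagga's v2 REST API.
# API_KEY, API_SECRET, and the image URL are placeholders.
import requests

API_KEY = "your_api_key"        # placeholder credential
API_SECRET = "your_api_secret"  # placeholder credential

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo_album_page.jpg"},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry pairs a confidence score with a language-keyed tag name.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}%")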

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

clothing 93%
person 87.3%
text 83.3%
woman 69%
old 67.8%
human face 52.4%
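
These Microsoft tags correspond to the tag operation of Azure Computer Vision. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and file name are placeholders.

# Hedged sketch: image tags with Azure Computer Vision.
# ENDPOINT and KEY are placeholders for real service credentials.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_subscription_key"                                      # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo_album_page.jpg", "rb") as f:  # hypothetical local image
    result = client.tag_image_in_stream(f)

# Each tag pairs a name with a 0-1 confidence score.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}%")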

Face analysis


AWS Rekognition

Age 18-26
Gender Female, 100%
Calm 83.1%
Sad 9.7%
Confused 1.8%
Disgusted 1.7%
Angry 1.2%
Happy 1.1%
Surprised 0.9%
Fear 0.6%
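
The age range, gender, and emotion scores above map onto Amazon Rekognition's DetectFaces operation when all face attributes are requested. A minimal boto3 sketch, with an illustrative file name:

# Minimal sketch: face attributes with Amazon Rekognition DetectFaces.
# The image file name is an illustrative assumption.
import boto3

client = boto3.client("rekognition")

with open("photo_album_page.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
    # Emotions arrive as a list of {Type, Confidence} pairs.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")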

Microsoft Cognitive Services

Age 31
Gender Female
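
Microsoft's single age/gender estimate is the kind of result the 2022-era Azure Face API returned when asked for face attributes; Microsoft has since restricted access to these attributes. A hedged sketch with the azure-cognitiveservices-vision-face SDK; endpoint and key are placeholders.

# Hedged sketch: age/gender estimates with the legacy Azure Face API.
# ENDPOINT and KEY are placeholders; these attributes have since been
# restricted by Microsoft, so this reflects the 2022-era API surface.
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_subscription_key"                                      # placeholder

client = FaceClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo_album_page.jpg", "rb") as f:
    faces = client.face.detect_with_stream(
        f, return_face_attributes=["age", "gender"]
    )

for face in faces:
    print(f"Age {face.face_attributes.age:.0f}")
    print(f"Gender {face.face_attributes.gender.capitalize()}")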

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
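
Google Vision reports face attributes as coarse likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why every row above reads "Very unlikely". A sketch with the google-cloud-vision client library; the file name is a placeholder.

# Sketch: face likelihoods with the google-cloud-vision client library.
# The image file name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo_album_page.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enum buckets, not numeric confidences.
    print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger:", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy:", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)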

Feature analysis

Amazon

Person 96.8%
Painting 69.2%

Captions

Microsoft

a vintage photo of a person 86.9%
a vintage photo of a person sitting on a table 81.7%
a vintage photo of a person sitting on a bus 41.9%
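
The ranked captions above, each with a confidence score, are what Azure Computer Vision's describe operation produces. A hedged sketch using the same SDK as the tag example earlier; endpoint, key, and file name are placeholders.

# Hedged sketch: image captions with Azure Computer Vision.
# ENDPOINT and KEY are placeholders for real service credentials.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_subscription_key"                                      # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo_album_page.jpg", "rb") as f:  # hypothetical local image
    analysis = client.describe_image_in_stream(f, max_candidates=3)

# Each candidate caption carries natural-language text and a 0-1 confidence.
for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")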