Human Generated Data

Title

Photo Album

Date

c. 1857 - c. 1874

People
Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from Widener Library, Harvard University, 1978.484.55


Machine Generated Data

Tags (confidence in %)

Amazon
created on 2022-01-09

Clothing 99.6
Apparel 99.6
Person 98.7
Human 98.7
Hat 93.3
Sun Hat 69.8
Furniture 67.9
Female 55.3
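
The Amazon tags above are label/confidence pairs in the shape returned by AWS Rekognition's DetectLabels API. A minimal sketch of formatting such a response is below; the sample response is hypothetical, mirroring a few of the tags listed, and a real call would instead use `boto3.client("rekognition").detect_labels(...)`.

```python
# Hypothetical Rekognition-style response, shaped like the tags above.
sample_response = {
    "Labels": [
        {"Name": "Clothing", "Confidence": 99.6},
        {"Name": "Person", "Confidence": 98.7},
        {"Name": "Hat", "Confidence": 93.3},
    ]
}

def format_labels(response, min_confidence=50.0):
    """Return 'Name Confidence' lines for labels above a threshold."""
    return [
        f"{label['Name']} {label['Confidence']:.1f}"
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]

for line in format_labels(sample_response):
    print(line)
```

Raising `min_confidence` filters out weaker tags such as "Female 55.3" in the list above.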

Imagga
created on 2022-01-09

art 15.7
people 15.6
person 14.8
black 13.5
fashion 12.8
dress 11.7
portrait 11.6
old 11.1
wall 11.1
grunge 11.1
adult 11
house 10.9
man 10.7
musical instrument 10.7
sexy 10.4
style 10.4
design 9.8
hat 9.5
color 9.4
clothing 9.4
model 9.3
sketch 9.3
stringed instrument 9
decoration 8.9
home 8.8
hair 8.7
window 8.7
face 8.5
device 8.5
newspaper 8.4
church 8.3
holding 8.2
religion 8.1
male 7.9
smile 7.8
antique 7.8
painter 7.7
child 7.6
human 7.5
one 7.5
vintage 7.4
girls 7.3
paint 7.2
music 7.2
body 7.2
covering 7.2
interior 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

clothing 98.8
person 96.7
text 96.7
old 94.8
window 90.7
fashion accessory 77
human face 75.6
smile 66.3
hat 60.8
cowboy hat 51.9
clothes 19.6

Face analysis

AWS Rekognition

Age 42-50
Gender Male, 99.6%
Sad 48.5%
Calm 34.4%
Confused 12.9%
Happy 1.1%
Angry 0.9%
Disgusted 0.7%
Surprised 0.7%
Fear 0.7%
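
The emotion scores above follow the structure of an AWS Rekognition DetectFaces face record. A sketch of picking the dominant emotion from such a record follows; the sample face is hypothetical, shaped like the values listed.

```python
# Hypothetical DetectFaces-style face record, mirroring the values above.
sample_face = {
    "AgeRange": {"Low": 42, "High": 50},
    "Gender": {"Value": "Male", "Confidence": 99.6},
    "Emotions": [
        {"Type": "SAD", "Confidence": 48.5},
        {"Type": "CALM", "Confidence": 34.4},
        {"Type": "CONFUSED", "Confidence": 12.9},
    ],
}

def top_emotion(face):
    """Return the (type, confidence) pair with the highest confidence."""
    best = max(face["Emotions"], key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

print(top_emotion(sample_face))  # dominant emotion for this face
```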

Microsoft Cognitive Services

Age 44
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%
Hat 93.3%

Captions

Microsoft

a vintage photo of a person sitting on a bus 38.2%
a vintage photo of a man and a woman sitting on a bus 25.6%
a vintage photo of a person sitting on a bus 25.5%