Human Generated Data

Title

Untitled (unidentified woman, wearing sari, crouched in front of stone, with rolling pin)

Date

1860-1899

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.329.3

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Human 98.9
Person 98.9
Crypt 65.5
Apparel 63.9
Clothing 63.9
Dungeon 58.8
Art 58.6
Drawing 58.6
Sketch 58.6
Female 55.8

Imagga
created on 2022-02-25

vintage 53.7
old 50.9
paper 48.8
grunge 44.3
book jacket 43
antique 41.7
texture 40.3
retro 35.3
jacket 34.5
ancient 32
empty 30.1
aged 29.9
dirty 29.9
document 26.9
wrapping 26.4
wallpaper 25.3
letter 24.8
book 24.7
binding 24.7
rough 24.6
page 24.2
parchment 24
art 23.1
cardboard 23.1
blank 22.3
container 22.2
damaged 21
material 20.6
aging 20.2
covering 20.1
backgrounds 19.5
brown 19.2
canvas 19
sheet 18.8
pattern 18.5
grungy 18
wall 17.1
cover 16.7
design 16.6
stained 16.4
crate 16.2
history 16.1
frame 16
worn 15.3
envelope 15.1
manuscript 14.7
pages 14.7
border 14.5
box 14.3
decorative 14.2
historic 13.8
crumpled 13.6
ragged 12.7
stains 12.7
torn 12.6
decay 12.6
burnt 11.7
old fashioned 11.4
textured 11.4
faded 10.7
board 10.7
surface 10.6
poster 10.4
stamp 10.2
text 9.6
mail 9.6
museum 9.2
postal 8.8
grime 8.8
fracture 8.8
symbol 8.8
rustic 8.6
obsolete 8.6
post 8.6
packet 8.5
wood 8.3
grain 8.3
note 8.3
detail 8.1
yellow 8
structure 7.9
postage 7.9
shabby 7.9
correspondence 7.8
artistic 7.8
color 7.8
sepia 7.8
stain 7.7
spot 7.7
rusty 7.6
weathered 7.6
package 7.5
backdrop 7.4
style 7.4
decoration 7.1
wooden 7

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

clothing 99.1
person 98.9
human face 98.8
old 92.6
text 92.1
woman 86.7
smile 79.9
black 76.8
white 68.6
girl 68.1
vintage 29.2
picture frame 14
stone 6

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 99.3%
Angry 62%
Calm 14.2%
Sad 10.4%
Surprised 3.9%
Confused 3.7%
Fear 2.9%
Disgusted 2.2%
Happy 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a vintage photo of a woman 85.7%
a vintage photo of a woman sitting in a box 73.1%
an old photo of a woman 73%