Human Generated Data

Title

Untitled (unidentified woman, wearing sari, seated cross-legged on floor holding object over bowl, kettle to her left)

Date

1860-1899

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.329.2

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99
Human 99
Nature 96.8
Outdoors 96.3
Clothing 88.5
Apparel 88.5
Home Decor 86.9
Countryside 85.4
Brick 84.4
Wood 82.4
Rural 72.7
Linen 71.9
Face 71.8
Shelter 68.7
Building 68.7
Painting 67.3
Art 67.3
Portrait 62.1
Photography 62.1
Photo 62.1
Hut 61.3
Sitting 60.8
Shack 59.1
Plywood 57.6
Vehicle 56
Transportation 56
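
The Amazon tag list above pairs each label with a confidence score, which is the shape of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such labels could be retrieved, assuming boto3 with valid AWS credentials and a hypothetical local copy of the photograph named photo.jpg (the file name is not part of the record above):

import boto3

# Hypothetical local copy of the photograph; the file name is an assumption.
IMAGE_PATH = "photo.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores (0-100),
# comparable to the tag/confidence pairs listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")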

Clarifai
created on 2023-10-28

portrait 99.8
people 99.4
art 98.6
vintage 97.8
old 96.4
adult 96.4
wear 96
one 95.5
street 94.6
documentary 94.4
girl 94.1
woman 93.9
sepia pigment 91.6
sepia 90.6
monochrome 89.3
retro 89.2
analogue 87.5
model 86.3
man 85.1
furniture 83.9

Imagga
created on 2022-02-25

old 36.9
ancient 36.3
sculpture 29.6
antique 27.8
vintage 25.6
statue 25.2
grunge 23.8
art 23.8
architecture 22.8
monument 20.6
aged 19.9
retro 19.7
wall 18.9
building 18.6
structure 17.2
stone 17.2
culture 17.1
history 17
historic 16.5
texture 16
decoration 15
tourism 14.9
memorial 14.8
landmark 14.4
religion 14.3
old fashioned 14.3
historical 14.1
travel 14.1
detail 13.7
city 13.3
marble 13.2
empty 12.9
paper 12.6
famous 12.1
dirty 11.8
frame 11.7
damaged 11.4
design 11.4
religious 11.2
border 10.9
sepia 10.7
decay 10.6
newspaper 10.4
carving 10.3
blank 10.3
parchment 9.6
god 9.6
product 9.6
material 8.9
creation 8.8
crack 8.7
obsolete 8.6
rusty 8.6
weathered 8.6
face 8.5
house 8.4
decorative 8.4
traditional 8.3
grain 8.3
pattern 8.2
world 8.2
container 8.2
tourist 8.2
brown 8.1
fountain 8
artistic 7.8
burnt 7.8
crumpled 7.8
figure 7.6
grungy 7.6
cadaver 7.4
symbol 7.4
upright 7.4
column 7.3

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

clothing 98.5
person 98.5
text 97.3
human face 95.1
old 86.4
white 62.4
man 60.6
picture frame 16

Color Analysis

Face analysis

AWS Rekognition

Age 16-24
Gender Male, 61.6%
Calm 58.8%
Confused 23.4%
Angry 14.3%
Surprised 1.1%
Disgusted 0.8%
Fear 0.7%
Sad 0.6%
Happy 0.3%

Microsoft Cognitive Services

Age 28
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
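
The face analysis values above (age range, gender, and emotion scores from AWS Rekognition) follow the structure of Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch under the same assumptions as the labeling example (boto3 with AWS credentials and a hypothetical photo.jpg):

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates,
# the fields behind values such as "Age 16-24" and "Calm 58.8%" above.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")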

Feature analysis

Amazon

Person 99%
Painting 67.3%

Categories

Imagga

paintings art 94.1%
food drinks 5.1%