Human Generated Data

Title

Untitled (portrait of a seated Native American man)

Date

1880s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2947

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.4
Human 99.4
Leisure Activities 63.4
Toy 62
Hula 62
Costume 55.7
Building 55.6

Clarifai
created on 2023-10-25

people 99.3
art 99
sepia pigment 98.6
wear 97.6
retro 96.7
portrait 96.2
man 96.1
two 95.5
vintage 95.1
old 94.8
sepia 94.7
antique 93.9
paper 93.9
illustration 93.3
one 93.3
woman 92.1
adult 91.8
painting 90.2
print 88
parchment 86.4

Imagga
created on 2022-01-09

toilet seat 36.7
seat 29.7
texture 29.2
old 25.8
sand 24.9
furniture 24.1
antique 20.9
ancient 20.7
grunge 20.4
tile 19.6
vintage 16.5
textured 15.8
earth 15.7
aged 15.4
dirty 15.4
pattern 15
furnishing 14.9
travel 14.8
stone 14.7
rough 14.6
history 14.3
brown 14
sculpture 13.8
architecture 13.8
historic 13.7
art 13.2
retro 13.1
beach 12.6
wallpaper 12.2
wall 12
culture 12
paper 11.9
grungy 11.4
design 11.2
temple 11.1
ruler 10.5
worn 10.5
text 10.5
detail 10.5
building 10.4
stucco 9.9
material 9.8
weathered 9.5
tourism 9.1
religion 9
structure 8.7
natural 8.7
blank 8.6
carving 8.6
historical 8.5
desert 8.4
marble 8.3
backgrounds 8.1
surface 7.9
rock 7.8
sea 7.8
statue 7.8
summer 7.7
museum 7.6
document 7.4
gold 7.4
note 7.3
memorial 7.1

Google
created on 2022-01-09

Sleeve 87.2
Hat 83.9
Tree 82.9
Wood 81.6
Beige 78.4
Tints and shades 77.4
Art 77.3
Vintage clothing 76.2
Visual arts 69.5
Rectangle 67.4
History 66.9
Painting 66.8
Beard 64.9
Sitting 63.7
Illustration 63.5
Paper product 62
Room 61.3
Circle 57.3
Portrait 54.6
Facial hair 54.3

Microsoft
created on 2022-01-09

old 98.9
clothing 98.3
wall 95.3
text 93.6
man 93.3
person 87.2
black 85.1
white 80.1
posing 76.2
vintage 73.1
human face 71.4
drawing 62
time 50.2
stone 47.6
building material 26.2

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 95.8%
Angry 71.3%
Confused 16.4%
Calm 11%
Disgusted 0.5%
Surprised 0.3%
Sad 0.2%
Fear 0.2%
Happy 0.1%

Microsoft Cognitive Services

Age 36
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

paintings art 99.9%