Human Generated Data

Title

19th Century Male

Date

1844-1857

People

Artist: Toppan Carpenter & Co., American

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Dorothy Rackemann, daughter of Francis Rackemann, class of 1909, M.D. 1912, M21564

Machine Generated Data

Tags

Amazon
created on 2022-03-12

Person 95.6
Human 95.6
Art 69.2
Hole 62.7
Text 61.6
Dish 58.1
Food 58.1
Meal 58.1

Imagga
created on 2022-03-12

wall clock 100
clock 100
timepiece 85.4
measuring instrument 57.2
old 51.6
vintage 48.8
grunge 46.9
antique 46.8
aged 43.5
paper 42.4
texture 41.7
dirty 35.3
wallpaper 33.7
ancient 32
parchment 31.7
empty 30.1
worn 28.7
instrument 28
retro 27.9
document 26.9
cardboard 26.9
material 26.8
wall 25.7
rough 25.5
page 24.1
blank 23.2
decay 22.2
damaged 22
canvas 20.9
stain 19.2
aging 19.2
grungy 19
manuscript 18.6
pattern 18.5
crumpled 18.5
textured 18.4
brown 17.7
text 17.5
book 17.4
artistic 17.4
art 16.9
decorative 16.7
grain 16.6
design 15.8
grime 15.6
letter 15.6
space 15.5
stained 15.4
rusty 15.2
old fashioned 15.2
textures 15.2
surface 15
stains 14.6
weathered 14.3
push button 14.1
sheet 14.1
cover 13.9
backgrounds 13.8
distressed 13.8
fracture 13.6
border 13.6
historic 12.8
shabby 12.8
ragged 12.7
torn 12.6
rust 12.5
scratched 11.8
tattered 10.8
pages 10.7
fiber 10.6
obsolete 10.5
detail 10.5
frame 10
paint 10
sign 9.8
age 9.5
ornament 9.5
yellow 9.3
backdrop 9.1
abrasion 8.9
pasteboard 8.9
uneven 8.8
burnt 8.8
messy 8.7
spotted 8.7
architecture 8.6
card 8.5
symbol 8.1
stucco 8
structure 7.8
color 7.8
faded 7.8
beige 7.7
element 7.4
metal 7.2

Google
created on 2022-03-12

Brown 98
Rectangle 86.4
Tints and shades 76.9
Wood 69.5
Art 67
Visual arts 66.8
Paper 66.2
History 65.5
Paper product 65.2
Circle 61.5
Font 61.3
Collar 60
Room 53.5
Square 52.3
Suit 52
Facial hair 51.1
Portrait 50.9

Microsoft
created on 2022-03-12

wall 95.7
indoor 95.6
human face 93.4
drawing 75.9
white 75.4
person 63.6
dirty 26.6

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 100%
Calm 96.9%
Happy 0.9%
Angry 0.7%
Surprised 0.5%
Sad 0.4%
Fear 0.3%
Disgusted 0.2%
Confused 0.2%

Microsoft Cognitive Services

Age 44
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 95.6%

Captions

Microsoft

a hole in the ground 53.5%
a close up of a hole in the ground 53.4%
a hole in the wall 50.9%