Human Generated Data

Title

Untitled (three unidentified children, girl standing, smallest boy seated on footstool, boy in foreground with toy wheelbarrow)

Date

1880-1900

People

Artist: A. N. Camp, American, active 1880s-1900s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3659

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Person 99.3
Human 99.3
Person 98.7
Person 88.8
Text 87.7
Leisure Activities 84.8
Musical Instrument 74.4
Musician 71.6
Advertisement 68.6
Apparel 67.9
Clothing 67.9
Poster 67
Paper 58.4
Lute 58.1
Photography 55.9
Photo 55.9
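
The Rekognition tags above pair a label with the model's confidence score. As a hedged illustration only, not the museum's actual pipeline, the sketch below shows how comparable labels could be requested from the AWS Rekognition API via boto3; the configured credentials and the local filename "photograph.jpg" are assumptions.

import boto3

def detect_labels(path, min_confidence=50.0):
    """Return (label, confidence) pairs from AWS Rekognition for a local image."""
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,
    )
    # Each returned label carries a name and a confidence score,
    # matching the "Person 99.3", "Text 87.7" style of the list above.
    return [(label["Name"], round(label["Confidence"], 1))
            for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("photograph.jpg"):
        print(name, confidence)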

Imagga
created on 2022-01-30

paper 59.7
grunge 53.7
old 51.6
vintage 50.5
book 49.6
antique 43.4
notebook 42.6
texture 41.7
aged 38.9
retro 36.9
blank 36
snapshot 36
empty 35.2
ancient 33.8
frame 33.4
page 32.5
parchment 27.9
document 26
border 25.3
art 24.7
wallpaper 23.8
brown 23.6
stained 22.1
design 22
damaged 22
grungy 21.8
dirty 21.7
aging 21.1
product 20.8
journal 20.5
canvas 19.9
material 19.7
creation 18.3
worn 18.2
manuscript 17.6
textured 17.5
note 17.5
text 17.5
sheet 16.9
pages 16.6
pattern 16.4
rough 16.4
cardboard 16.3
old fashioned 16.2
decorative 15.9
letter 15.6
yellow 15.2
card 15.1
backgrounds 14.6
decay 14.5
history 14.3
historic 13.8
wall 13.7
grime 13.7
crumpled 13.6
envelope 13.5
artistic 13
cover 13
grain 12.9
stains 12.7
torn 12.6
stain 12.5
spot 12.5
rusty 12.4
burnt 11.7
memory 11.6
age 11.4
open 10.8
ragged 10.7
backdrop 10.7
fracture 10.7
weathered 10.5
textures 10.4
black 10.2
graphic 10.2
space 10.1
message 10.1
container 9.9
photograph 9.6
used 9.6
dirt 9.6
color 9.5
style 8.9
shabby 8.8
diary 8.8
tracery 8.8
mottled 8.8
reminder 8.7
tattered 7.9
broad 7.9
scratched 7.8
crease 7.8
structure 7.8
faded 7.8
succulent 7.8
film 7.7
fiber 7.7
obsolete 7.7
floral 7.7
element 7.4
object 7.3
decoration 7.2
creative 7.1

Google
created on 2022-01-30

Microsoft
created on 2022-01-30

text 97.3
person 95.7
clothing 90.6
drawing 81.7
cartoon 79.3
old 65.7
sketch 63.4

Face analysis

AWS Rekognition

Age 1-7
Gender Female, 99.9%
Calm 47.3%
Sad 35.4%
Angry 11.7%
Fear 1.9%
Happy 1.5%
Disgusted 0.8%
Surprised 0.7%
Confused 0.6%

AWS Rekognition

Age 4-12
Gender Female, 99.3%
Angry 67.6%
Sad 14.8%
Calm 14.5%
Happy 1%
Fear 0.8%
Confused 0.6%
Surprised 0.5%
Disgusted 0.3%

AWS Rekognition

Age 1-7
Gender Male, 78.7%
Calm 88.2%
Angry 8.2%
Sad 2.1%
Surprised 0.4%
Confused 0.4%
Happy 0.3%
Disgusted 0.3%
Fear 0.2%

Microsoft Cognitive Services

Age 4
Gender Male

Microsoft Cognitive Services

Age 12
Gender Female

Microsoft Cognitive Services

Age 4
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
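
The per-face estimates above (an age range, a gender guess, and a ranked emotion breakdown) follow the shape of AWS Rekognition's face detection output. A minimal sketch of how similar output could be produced is given below, assuming boto3 with configured AWS credentials; the filename is hypothetical, and this is not the museum's own tooling.

import boto3

def analyze_faces(path):
    """Print age range, gender, and emotion confidences for each detected face."""
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unsorted; rank them by confidence as in the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
        print()

if __name__ == "__main__":
    analyze_faces("photograph.jpg")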

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a vintage photo of a person 61.7%
a vintage photo of a person 57.4%
a vintage photo of a laptop 51.4%

Text analysis

Amazon

207
ST.
MAIN
2.2002.3659
NEW
A.N.Camp
JAMESTOWN,
NEW YORKS
YORKS

Google

207
(MAIN
AMamp JAMESTOWN 207 (MAIN ("ST. VEW YORKS 2.2008.3859
JAMESTOWN
YORKS
AMamp
("ST.
VEW
2.2008.3859
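
The fragments above appear to be raw word-level OCR detections of the photographer's studio imprint (A. N. Camp, Jamestown, New York), reported without reading-order cleanup. As a hedged sketch only, comparable word detections could be pulled from AWS Rekognition as shown below; the boto3 setup and the filename are assumptions, not the pipeline used here.

import boto3

def detect_words(path):
    """Return word-level OCR detections from AWS Rekognition for a local image."""
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_text(Image={"Bytes": image_bytes})
    # Type is either "LINE" or "WORD"; the word detections resemble the
    # short fragments listed above ("207", "MAIN", "A.N.Camp", ...).
    return [det["DetectedText"]
            for det in response["TextDetections"]
            if det["Type"] == "WORD"]

if __name__ == "__main__":
    for word in detect_words("photograph.jpg"):
        print(word)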