Human Generated Data

Title

Untitled (woman standing, holding umbrella)

Date

c. 1870-c. 1890

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.367.48

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Human 91.1
Person 91.1
Advertisement 87.5
Poster 86.7
Clothing 85.6
Apparel 85.6
Art 85.1
Person 76.1
Collage 72.6
Paper 59.3
Brochure 58.2
Flyer 58.2
Painting 57.5
Person 43.2
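
The Amazon tags above are label/confidence pairs of the kind returned by Rekognition's DetectLabels call. As a minimal sketch, the flat "Label score" lines can be produced from a DetectLabels-style response like this; the `sample` response below is hypothetical, shaped after boto3's `rekognition.detect_labels()` output and echoing a few of the values in this record.

```python
# Sketch: flatten a Rekognition-style DetectLabels response into
# "Name confidence" lines, highest confidence first.
def flatten_labels(response, min_confidence=55.0):
    """Return 'Name confidence' lines for labels at or above the threshold."""
    labels = [
        (lbl["Name"], lbl["Confidence"])
        for lbl in response["Labels"]
        if lbl["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    return [f"{name} {conf:.1f}" for name, conf in labels]

# Hypothetical response echoing the first few Amazon tags in this record.
sample = {
    "Labels": [
        {"Name": "Human", "Confidence": 91.1},
        {"Name": "Person", "Confidence": 91.1},
        {"Name": "Advertisement", "Confidence": 87.5},
        {"Name": "Paper", "Confidence": 59.3},
    ]
}

print(flatten_labels(sample))
# → ['Human 91.1', 'Person 91.1', 'Advertisement 87.5', 'Paper 59.3']
```

The Clarifai and Imagga lists have the same flat shape, so the same flattening applies with their respective response fields.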

Clarifai
created on 2018-03-16

people 98.7
illustration 97.2
adult 96.8
print 96.6
painting 96.4
no person 95
wear 94.9
art 93.8
man 93.5
retro 93.4
woman 93
paper 92.3
indoors 91.2
group 91.1
vintage 90
antique 89.7
old 89.5
one 87.5
picture frame 87.5
text 86.9

Imagga
created on 2018-03-16

envelope 100
container 96.1
grunge 50.3
paper 50.2
vintage 48.9
old 46.7
antique 42.5
texture 41
retro 41
aged 39.9
ancient 31.2
frame 30.8
blank 25.7
design 25.3
page 25.1
material 25
floral 23.8
parchment 23.1
wallpaper 22.2
art 22.1
damaged 22
border 21.7
empty 20.6
flower 20
crumpled 19.4
decay 19.3
brown 19.2
notebook 19
dirty 19
grime 17.6
document 16.7
decorative 16.7
fracture 16.6
stains 16.5
letter 16.5
grain 15.7
card 15.5
backgrounds 15.4
textures 15.2
pattern 15.1
element 14.9
structure 14.8
historic 14.7
album 14.6
graphic 14.6
old fashioned 14.3
canvas 14.2
textured 14
artwork 13.7
mottled 13.7
cardboard 13.5
sheet 13.2
style 12.6
leaf 12.5
worn 12.4
space 12.4
backdrop 12.4
text 12.2
manuscript 11.8
decoration 11.7
ragged 11.7
faded 11.7
grungy 11.4
book 11.4
artistic 11.3
rough 10.9
torn 10.7
stained 10.6
stain 10.6
cash 10.1
color 10
paint 10
broad 9.8
tracery 9.8
succulent 9.7
scrapbook 9.7
spot 9.6
wall 9.4
currency 9
detail 8.9
shabby 8.8
surface 8.8
burst 8.8
crack 8.7
your 8.7
ornament 8.6
rusty 8.6
money 8.5
stamp 8.4
note 8.3
burnt 7.8
drawing 7.7
detailed 7.7
fiber 7.7
aging 7.7
mail 7.7
age 7.6
greeting 7.4
message 7.3
collection 7.2
day 7.1

Google
created on 2018-03-16

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 51.9%
Happy 45.1%
Disgusted 45.1%
Surprised 45.1%
Calm 45.5%
Angry 46.5%
Sad 52.7%
Confused 45.1%

AWS Rekognition

Age 16-27
Gender Female, 50.2%
Angry 49.6%
Calm 49.7%
Happy 49.6%
Disgusted 49.6%
Surprised 49.6%
Sad 50%
Confused 49.5%

AWS Rekognition

Age 45-65
Gender Female, 50.5%
Angry 49.5%
Sad 50.1%
Calm 49.7%
Disgusted 49.5%
Happy 49.6%
Surprised 49.5%
Confused 49.5%

AWS Rekognition

Age 10-15
Gender Male, 50.3%
Happy 49.5%
Surprised 49.6%
Disgusted 49.6%
Confused 49.7%
Angry 49.6%
Calm 49.7%
Sad 49.8%

AWS Rekognition

Age 26-43
Gender Male, 53.2%
Confused 45.1%
Surprised 45.1%
Angry 45.3%
Sad 54.1%
Happy 45.1%
Calm 45.2%
Disgusted 45.1%

Microsoft Cognitive Services

Age 27
Gender Female
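
Each AWS Rekognition block above lists an estimated age range, a gender guess with confidence, and a score per emotion, matching the FaceDetail structure returned by Rekognition's DetectFaces call with all attributes requested. A minimal sketch of formatting one such face follows; the `face` dict is hypothetical, shaped after boto3's `rekognition.detect_faces(..., Attributes=["ALL"])` output and echoing the first face block in this record.

```python
# Sketch: render a Rekognition-style FaceDetail in the layout used above
# (age range, gender with confidence, then per-emotion scores).
def summarize_face(face):
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%",
    ]
    for emotion in face["Emotions"]:
        lines.append(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
    return lines

# Hypothetical FaceDetail echoing the first AWS Rekognition block above.
face = {
    "AgeRange": {"Low": 26, "High": 43},
    "Gender": {"Value": "Female", "Confidence": 51.9},
    "Emotions": [
        {"Type": "SAD", "Confidence": 52.7},
        {"Type": "ANGRY", "Confidence": 46.5},
        {"Type": "CALM", "Confidence": 45.5},
    ],
}

print("\n".join(summarize_face(face)))
```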

Feature analysis

Amazon

Person 91.1%

Captions

Microsoft

a photo of a person 25.8%
a person taking a selfie 8.6%
a picture of a person 8.5%

Text analysis

Amazon

p
Pes
P195231-20
Ferssehaer
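
The detected strings above are the kind of output produced by Rekognition's DetectText call. As a minimal sketch, the flat text lines can be pulled from a DetectText-style response like this; the `sample` response is hypothetical, shaped after boto3's `rekognition.detect_text()` output.

```python
# Sketch: extract LINE-type detections from a Rekognition-style
# DetectText response, in the order returned.
def detected_lines(response):
    """Return the text of LINE-type detections only, skipping WORD entries."""
    return [
        det["DetectedText"]
        for det in response["TextDetections"]
        if det["Type"] == "LINE"
    ]

# Hypothetical response echoing some of the detections above.
sample = {
    "TextDetections": [
        {"DetectedText": "p", "Type": "LINE"},
        {"DetectedText": "Pes", "Type": "LINE"},
        {"DetectedText": "P195231-20", "Type": "LINE"},
        {"DetectedText": "p", "Type": "WORD"},
    ]
}

print(detected_lines(sample))
# → ['p', 'Pes', 'P195231-20']
```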