Human Generated Data

Title

A Norfolk Flower

Date

c. 1888

People

Artist: Peter Henry Emerson, British, English, 1856–1936

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.325

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Bonnet 100
Clothing 100
Hat 100
Apparel 100
Painting 92.2
Art 92.2
Person 84.2
Human 84.2
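
Label/confidence pairs in this shape are what Amazon Rekognition's DetectLabels operation returns. A minimal boto3 sketch; the file name, region, and confidence threshold are illustrative, not taken from this record:

```python
import boto3

# Rekognition label detection; region and file name are placeholders.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("norfolk_flower.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=80)

# Print name/confidence pairs in the same shape as the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```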

Clarifai
created on 2023-10-25

portrait 99.8
people 99.3
art 99.1
lid 98.7
sepia 98.5
adult 98.3
veil 98.3
wear 97.9
retro 97.8
man 97.7
facial hair 97.7
one 97.2
old 97
vintage 96.7
antique 96.7
sepia pigment 95.6
print 95.4
military 94.7
leader 92.4
mustache 91.4

Imagga
created on 2022-01-08

portrait 35
child 31.3
face 31.3
person 27.4
model 25.7
people 25.7
pretty 25.2
cute 25.1
eyes 24.1
adult 23.9
sand 22.5
fashion 21.9
attractive 21.7
hair 20.6
expression 19.6
soil 19.3
sexy 18.5
happy 18.2
smile 17.8
human 17.3
lady 16.2
kid 16
blond 15.6
childhood 15.2
hat 14.6
sketch 14.5
earth 13.6
looking 13.6
studio 12.9
head 12.6
little 12.4
look 12.3
drawing 11.7
smiling 11.6
lifestyle 11.6
makeup 11.1
gorgeous 10.9
towel 10.7
women 10.3
close 10.3
youth 10.2
skin 10.2
hand 9.9
lovely 9.8
one 9.7
closeup 9.4
clothing 9.4
happiness 9.4
black 9
cheerful 8.9
posing 8.9
bride 8.8
representation 8.7
love 8.7
innocence 8.7
seductive 8.6
lips 8.3
adorable 8.3
care 8.2
sensual 8.2
make 8.2
currency 8.1
braid 8.1
baby 7.9
boy 7.8
color 7.8
ethnicity 7.7
modern 7.7
casual 7.6
hairstyle 7.6
joy 7.5
cosmetics 7.5
fun 7.5
style 7.4
economy 7.4
banking 7.4
long 7.3
nice 7.3
girls 7.3
sensuality 7.3
paper 7.2
body 7.2
spa 7.2
religion 7.2

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

human face 98.6
text 95.6
drawing 95.3
person 93.3
sketch 86.2
old 70.5
painting 62.8
clothing 56.4

Face analysis

AWS Rekognition

Age 6-14
Gender Male, 91.6%
Calm 53.1%
Sad 15.2%
Angry 10.9%
Confused 9.6%
Fear 3.7%
Surprised 3.4%
Disgusted 2.8%
Happy 1.3%
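
Age range, gender, and ranked emotion scores like these are returned by Rekognition's DetectFaces operation when all facial attributes are requested. A sketch under the same placeholder assumptions as above:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("norfolk_flower.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort by confidence to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```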

Microsoft Cognitive Services

Age 19
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
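
The likelihood buckets above (Very unlikely, Unlikely, and so on) correspond to the Likelihood enum that Google Cloud Vision face detection returns for each attribute. A minimal sketch with the v2+ Python client; the file name is a placeholder:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("norfolk_flower.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum value, e.g. VERY_UNLIKELY or UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```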

Feature analysis

Amazon

Painting 92.2%
Person 84.2%

Categories

Imagga

interior objects 61.2%
food drinks 22.7%
paintings art 15.9%

Captions

Microsoft
created on 2022-01-08

a vintage photo of a person 83.4%
an old photo of a person 83.3%
old photo of a person 80%
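
Ranked caption candidates with confidence scores are the kind of output produced by the Azure Computer Vision describe operation. A sketch with the azure-cognitiveservices-vision-computervision SDK; endpoint, key, and file name are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key; substitute real resource values.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("norfolk_flower.jpg", "rb") as f:
    analysis = client.describe_image_in_stream(f, max_candidates=3)

# Each candidate carries a 0-1 confidence; scale it to match the listing above.
for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```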

Text analysis

Amazon

P.H.EMERSON

Google

PHEMERSON
PHEMERSON
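
Detected strings like these come from the services' OCR endpoints. A minimal sketch of Rekognition's DetectText operation (file name and region are placeholders):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("norfolk_flower.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Detections come back at both LINE and WORD granularity.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```

Google Vision's text_detection behaves similarly: the first text annotation holds the full detected string and the following annotations repeat it word by word, which is likely why PHEMERSON is listed twice above.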