Human Generated Data

Title

Untitled (little girl putting on coat)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17101
Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 98.9
Apparel 98.9
Human 98.4
Person 98.4
Dress 94.6
Floor 93.2
Sleeve 88.5
Flooring 87.3
Footwear 86.2
Shoe 86.2
Female 78.3
Face 73.6
Pants 72.4
Photography 68.8
Portrait 68.8
Photo 68.8
Baby 68
Overcoat 66
Coat 66
Door 65.8
Girl 64.2
Kid 63
Child 63
Long Sleeve 61.6
Suit 61.5
Outdoors 60.8
Woman 60.4
Standing 58.7
Indoors 55.6
Drawing 55.2
Art 55.2
Nature 55.1

Imagga
created on 2022-02-26

negative 28.1
person 24.5
film 23.1
people 21.8
adult 19.3
man 18.8
portrait 17.5
photographic paper 17.1
brass 17.1
dress 15.4
human 15
fashion 14.3
wind instrument 14.1
male 13.5
art 13.4
lifestyle 13
cornet 12.5
model 12.4
hair 11.9
posing 11.6
photographic equipment 11.5
musical instrument 11.4
face 11.4
lady 11.4
sexy 11.2
attractive 11.2
pretty 11.2
men 11.2
exercise 10.9
body 10.4
style 10.4
happiness 10.2
girls 10
sport 9.9
costume 9.7
look 9.6
standing 9.6
life 9.2
health 9
active 9
happy 8.8
professional 8.7
mask 8.7
love 8.7
smiling 8.7
casual 8.5
black 8.4
modern 8.4
elegance 8.4
statue 8.3
fun 8.2
marble 8.2
sensuality 8.2
stylish 8.1
fitness 8.1
child 8
teacher 7.9
women 7.9
cute 7.9
clothing 7.7
wall 7.7
bride 7.7
sculpture 7.6
performer 7.5
relaxation 7.5
device 7.5
clothes 7.5
historic 7.3
dancer 7.3
gorgeous 7.3
pose 7.2
smile 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 98.5
toddler 97.6
wall 96.1
clothing 95.3
person 93.5
baby 88.7
indoor 88
black and white 87.6
human face 81.6
girl 77.3
footwear 65.5
child 52.8

Face analysis
AWS Rekognition

Age 24-34
Gender Female, 59.2%
Surprised 94.6%
Calm 4.4%
Happy 0.5%
Fear 0.2%
Disgusted 0.1%
Angry 0.1%
Sad 0.1%
Confused 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.4%
Shoe 86.2%

Captions

Microsoft

a person standing in front of a window 71.1%
a person sitting in front of a mirror posing for the camera 56.7%
a person sitting in front of a window 56.6%

Text analysis

Amazon

TERAS-MAGOM