Human Generated Data

Title

Untitled (two girls in matching dresses, standing outside)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17306

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Dress 99.9
Clothing 99.9
Apparel 99.9
Person 99.4
Human 99.4
Person 98.9
Female 97.8
Woman 87
Shoe 85.4
Footwear 85.4
Girl 82.9
People 78
Face 75.9
Shoe 74
Outdoors 69.1
Blonde 68
Teen 68
Kid 68
Child 68
Portrait 63.4
Photography 63.4
Photo 63.4
Nature 57.9
Shoe 53.3

Clarifai
created on 2023-10-29

people 99.9
portrait 99
monochrome 98.8
wear 98.2
two 97.5
adult 97
child 96.6
group 95
three 94.8
woman 94.2
dress 92.6
man 92.4
street 91.3
retro 89.4
girl 87.9
four 84.2
leader 83.8
son 83.2
one 82.4
black and white 82.4

Imagga
created on 2022-02-26

people 24
dress 23.5
person 20.6
fashion 18.8
man 18.3
portrait 17.5
clothing 17.1
adult 16.4
kin 14.7
pretty 14
happy 13.1
outfit 12.7
women 12.6
attractive 12.6
old 12.5
happiness 11.7
cute 11.5
smile 11.4
lady 11.4
male 11.3
child 11.1
style 11.1
holiday 10.7
face 10.6
model 10.1
holding 9.9
cheerful 9.7
hat 9.7
bag 9.7
standing 9.6
clothes 9.4
costume 9.2
posing 8.9
couple 8.7
black 8.6
walking 8.5
outdoor 8.4
street 8.3
one 8.2
girls 8.2
lifestyle 7.9
culture 7.7
two 7.6
elegance 7.6
human 7.5
traditional 7.5
bride 7.5
joyful 7.3
protection 7.3
danger 7.3
smiling 7.2
celebration 7.2
history 7.1
hair 7.1
interior 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 98.7
tree 98.1
dress 97.8
clothing 96
text 95
person 88.7
posing 87.3
footwear 82.2
sport 81.9
standing 80.1
skirt 76.8
woman 73.1
black 72.5
smile 66.5

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 42-50
Gender Male, 94%
Calm 51.1%
Happy 42%
Fear 2.6%
Sad 1.3%
Disgusted 1%
Confused 0.9%
Angry 0.6%
Surprised 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 99.4%
Person 98.9%
Shoe 85.4%
Shoe 74%
Shoe 53.3%

Categories

Text analysis

Amazon

10
na

Google

MIL3
MIL3