Human Generated Data

Title

Untitled (woman with girl holding doll, outside)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17538

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Nature 99
Person 98.7
Human 98.7
Outdoors 98.4
Apparel 97.4
Clothing 97.4
Person 95.7
Water 94.9
Female 93.4
River 93.1
Child 86.4
Kid 86.4
Play 86
Plant 85.1
Vegetation 85.1
Girl 81.5
Rock 81.2
Tree 79.8
Face 78.8
Waterfall 78.5
Dress 76.2
Woman 69.3
Shoe 68.1
Footwear 68.1
Teen 67.2
People 62.6
Photography 61.2
Portrait 61.2
Photo 61.2
Path 58.6
Baby 58.1
Grass 56.2
Shoe 51.4

Imagga
created on 2022-02-26

runner 30.3
person 30.1
child 30
athlete 28.7
world 23.5
people 23.4
adult 22
portrait 21.4
man 21.3
contestant 19.9
outdoor 16.8
lifestyle 16.6
silhouette 16.6
outdoors 16.5
male 15.6
joy 15
sunset 14.4
fashion 13.6
hair 13.5
women 13.5
love 13.4
park 13.2
happy 13.2
sexy 12.9
black 12.7
attractive 12.6
happiness 12.5
model 12.4
sibling 12.4
walking 12.3
autumn 12.3
human 12
fun 12
dress 11.8
active 11.7
pretty 11.2
beach 11
dark 10.9
leisure 10.8
posing 10.7
lady 10.6
couple 10.5
one 10.5
body 10.4
sky 10.2
dirty 9.9
summer 9.7
play 9.5
sport 9.4
outside 9.4
youth 9.4
life 9.2
city 9.2
blond 9.1
danger 9.1
holding 9.1
style 8.9
looking 8.8
hands 8.7
men 8.6
walk 8.6
hand 8.4
clothing 8.4
action 8.4
color 8.3
ocean 8.3
mask 8.2
cute 7.9
together 7.9
umbrella 7.8
face 7.8
wall 7.7
casual 7.6
clouds 7.6
vacation 7.4
water 7.3
freedom 7.3
alone 7.3
mother 7.3
teenager 7.3
protection 7.3
gorgeous 7.3
sun 7.3
smiling 7.2
romantic 7.1
family 7.1
day 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 64.7%
Happy 68.6%
Angry 12.8%
Calm 5.2%
Sad 4.9%
Surprised 3.3%
Disgusted 2%
Confused 1.7%
Fear 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%
Shoe 68.1%

Captions

Microsoft

a person standing in a field 72.4%
a person is standing in the grass 70.9%
a person standing in the grass 70.8%

Text analysis

Amazon

АO