Human Generated Data

Title

Untitled (three girls on beach)

Date

1962

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16644

Machine Generated Data

Tags

Amazon
created on 2022-02-18

Clothing 100
Apparel 100
Shorts 100
Human 99.8
Person 99.8
Person 99.7
Person 99.5
Person 96.6
Play 87.9
Footwear 84.1
Shoe 84.1
Outdoors 81.9
Water 74.7
Nature 71
Plant 69.1
Child 68.2
Kid 68.2
Person 67.2
Grass 65.7
Face 64.9
People 63.3
Tree 61.2
Soil 58.9
Standing 57.2
Urban 56.9
Back 55.9
Female 55.5
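
The tag listings above are plain "label confidence" lines, where labels may span several words (e.g. "park bench 100", "black and white 91.8") and may repeat (e.g. "Person"). A minimal sketch of parsing them into (label, score) pairs — the function name and data shape are illustrative assumptions, not part of the museum's published format:

```python
def parse_tags(lines):
    """Parse 'label confidence' lines into (label, score) tuples.

    Splits at the last space so multi-word labels like 'park bench'
    stay intact; a list (not a dict) preserves repeated labels.
    """
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank separator lines
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

sample = [
    "Clothing 100",
    "Person 99.8",
    "Person 99.7",
    "Nature 71",
]
print(parse_tags(sample))
# → [('Clothing', 100.0), ('Person', 99.8), ('Person', 99.7), ('Nature', 71.0)]
```

The same routine applies unchanged to the Imagga and Microsoft tag sections below, since they share the trailing-number layout.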

Imagga
created on 2022-02-18

park bench 100
bench 100
seat 78.1
beach 53.5
furniture 48.9
sea 39.9
ocean 36.9
water 31.4
sand 31
vacation 30.3
summer 29.6
sky 25.5
furnishing 24.8
sunset 23.4
sun 21.7
man 21.5
relax 21.1
people 20.6
outdoors 20.2
horizon 19.8
waves 19.5
coast 18.9
railing 18.5
travel 18.3
holiday 17.9
landscape 17.9
couple 17.4
shore 16.8
adult 16.8
tropical 15.3
person 15.1
silhouette 14.9
lifestyle 14.5
love 14.2
together 14
walking 13.3
outdoor 13
male 12.8
two 12.7
clouds 12.7
leisure 12.5
seaside 12.3
coastline 12.2
sunny 12.1
looking 12
outside 12
child 11.7
walk 11.4
evening 11.2
happy 10.7
scenic 10.5
chair 10.4
tourist 10.2
island 10.1
relaxing 10
fun 9.7
boy 9.6
women 9.5
enjoy 9.4
happiness 9.4
sunrise 9.4
scenery 9
sunlight 8.9
family 8.9
hair 8.7
men 8.6
bay 8.5
portrait 8.4
relaxation 8.4
shoreline 8.3
tourism 8.3
lake 8.2
park 8.2
freedom 8.2
alone 8.2
recreation 8.1
romantic 8
father 7.8
run 7.7
winter 7.7
dusk 7.6
sport 7.6
vacations 7.5
sunshine 7.5
rest 7.4
back 7.3
romance 7.1
life 7.1
mountain 7.1
model 7

Google
created on 2022-02-18

Microsoft
created on 2022-02-18

outdoor 99.1
black and white 91.8
clothing 90.5
person 84.1
text 83.5
footwear 73.3
man 69.7
white 65.4
monochrome 54.8

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 69.3%
Calm 68.9%
Happy 14.7%
Surprised 6.1%
Angry 3.1%
Disgusted 2.4%
Sad 2.2%
Confused 1.8%
Fear 0.7%

AWS Rekognition

Age 19-27
Gender Female, 99.1%
Calm 84.7%
Happy 12.4%
Sad 1.4%
Surprised 0.7%
Confused 0.2%
Angry 0.2%
Fear 0.2%
Disgusted 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Shoe 84.1%

Captions

Microsoft

a group of people standing around a fire hydrant 40.2%
a group of people sitting around a fire hydrant 35.5%
an old photo of a person 35.4%

Text analysis

Amazon

J3
YT33A2-XAQO