Human Generated Data

Title

Untitled (two girls playing croquet)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17305

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 99.8
Person 99.8
Person 99.8
Clothing 99.4
Apparel 99.4
Shorts 99.3
Shoe 96.4
Footwear 96.4
Shoe 93.2
Ground 83.4
Female 83
Grass 82.9
Plant 82.9
Vegetation 77.6
Dress 75
Cricket 72.8
Sports 72.8
Sport 72.8
Tree 71.9
People 71.2
Photo 69.7
Face 69.7
Portrait 69.7
Photography 69.7
Outdoors 66
Field 64.5
Woman 64.5
Soil 63.9
Skin 59.7
Standing 57.7
Croquet 55.4
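
The Amazon tags above are label/confidence pairs of the kind returned by the AWS Rekognition label-detection API. The record does not say how they were produced; a minimal sketch of retrieving comparable labels with boto3 (the file name, region, and thresholds are assumptions, not values from this record) could look like this:

import boto3

# Assumed client setup and local image file; the actual pipeline behind this record is not documented here.
client = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,
    )

# Print each label name with its confidence, matching the "Label 99.8" layout above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')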

Imagga
created on 2022-02-26

staff 100
stick 100
crutch 100
man 32.9
sunset 26.1
beach 25.3
male 22
walking 21.8
people 21.2
silhouette 20.7
water 19.4
person 18.4
sport 18.2
adult 17.5
couple 17.4
sky 17.2
leisure 16.6
outdoors 16.4
active 16.2
sun 16.1
outdoor 16.1
sand 15.7
sea 14.9
summer 14.8
vacation 14.7
ocean 14.1
romance 13.4
love 12.6
senior 12.2
relax 11.8
dusk 11.4
walk 11.4
travel 11.3
outside 11.1
lifestyle 10.9
grass 10.3
happy 10
recreation 9.9
mountain 9.8
landscape 9.7
hobby 9.5
men 9.5
life 9.4
evening 9.3
exercise 9.1
health 9
fun 9
romantic 8.9
hiking 8.7
golf 8.6
two 8.5
sunrise 8.4
action 8.4
park 8.2
happiness 7.8
sunny 7.8
attractive 7.7
retirement 7.7
husband 7.6
shore 7.4
playing 7.3
fitness 7.2
body 7.2
coast 7.2
activity 7.2
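
The Imagga tags follow the same tag/confidence layout. Assuming they came from Imagga's v2 tagging endpoint (the credentials and image URL below are placeholders, not values from this record), a request might look like:

import requests

# Placeholder credentials; Imagga's REST API uses HTTP Basic auth with a key/secret pair.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
)

# Each result item carries a confidence score and a localized tag name.
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))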

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 99.7
tree 99.7
ground 98.1
standing 90.3
text 89.3
black and white 88.5
golf 87.3
person 59.6
man 59

Face analysis

Amazon

Google

AWS Rekognition

Age 29-39
Gender Female, 95.2%
Calm 56.8%
Happy 24.3%
Fear 9.9%
Disgusted 2.7%
Surprised 2.6%
Sad 1.7%
Angry 1.5%
Confused 0.6%

AWS Rekognition

Age 23-31
Gender Female, 86.4%
Calm 97.4%
Happy 1.4%
Sad 0.5%
Fear 0.2%
Surprised 0.2%
Disgusted 0.1%
Confused 0.1%
Angry 0.1%
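
Each AWS Rekognition block above reports an age range, gender estimate, and ranked emotions for one detected face. A minimal sketch of requesting those attributes with the boto3 detect_faces call (the image file name is an assumption) could be:

import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# One FaceDetails entry per detected face, mirroring the per-face blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')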

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
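
The Google Vision blocks report likelihood buckets (e.g. "Very unlikely") rather than percentages. Assuming the google-cloud-vision client library (v2 or later) and a local image file, a face-detection sketch producing the same fields would be:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())
response = client.face_detection(image=image)

# Each face annotation exposes the likelihood fields shown above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)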

Feature analysis

Amazon

Person 99.8%
Shoe 96.4%

Captions

Microsoft

a person standing on top of a dirt field 90.5%
a person standing on a dirt road 90.4%
a person standing on a dirt road 90.3%
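
The Microsoft captions pair a generated sentence with a confidence score, which is the output shape of Azure Computer Vision's image-description feature. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK (the endpoint, key, and file name are placeholders) could be:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)
with open("photo.jpg", "rb") as f:
    analysis = client.describe_image_in_stream(f, max_candidates=3)

# Print each candidate caption with its confidence, matching the list above.
for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")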

Text analysis

Amazon

12
٢٤٢
KODAK-2.M
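
The Amazon text results are OCR lines such as the frame number and Kodak edge markings. A minimal sketch of extracting them with Rekognition's detect_text call (the file name is an assumption) could be:

import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Keep only LINE-level detections, which correspond to the strings listed above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])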

Google

12
KODVK-2
12 KODVK-2