Human Generated Data

Title

Untitled (children in three-legged race)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17613

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Shorts 99.9
Clothing 99.9
Apparel 99.9
Person 99.6
Human 99.6
Person 99.6
Person 99.4
Person 99.4
Person 99.3
Person 99.3
Person 99.3
Person 97.9
Grass 93.4
Plant 93.4
Female 89.6
Tree 86.4
People 85.3
Dress 84.5
Outdoors 84.1
Vegetation 81.8
Kid 80.2
Child 80.2
Skirt 73
Girl 68
Face 67.6
Woman 67.1
Park 63.6
Lawn 63.6
Nature 62.7
Play 59.7
Yard 56.7
Photography 55.5
Photo 55.5
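
The Amazon tags above are label-detection output (AWS Rekognition is named explicitly in the face analysis below). As a minimal sketch, a boto3 call along these lines produces the same kind of name/confidence list; the file name and MinConfidence threshold are illustrative assumptions, not values recorded here.

    import boto3

    client = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("three_legged_race.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # assumed cutoff; the list above bottoms out near 55
    )

    # Print each label with its confidence, matching the layout of the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')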

Clarifai
created on 2023-10-29

people 99.9
child 99.2
group together 98.9
adult 98
group 97.5
wear 95.1
man 94.5
boy 93.7
many 92.8
monochrome 92.1
recreation 91
administration 90.9
woman 90.7
outfit 88.8
several 87.2
family 85.2
canine 85
nostalgia 84.8
war 83.5
uniform 83.4
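
The Clarifai concepts above can be reproduced, in outline, through Clarifai's v2 REST predict endpoint. This is a hedged sketch: the API key, model ID, and image URL are placeholders, and the exact model behind this record is not stated.

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder credential
    MODEL_ID = "general-image-recognition"  # assumed general-model ID

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    )
    response.raise_for_status()

    # Each concept carries a name and a 0-1 value; scale it to match the list above.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')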

Imagga
created on 2022-02-26

kin 56.7
man 25.5
people 25.1
sunset 22.5
male 19.9
person 19.7
beach 18.8
silhouette 18.2
outdoor 16.8
summer 16.7
outdoors 15.7
adult 15.6
water 14.7
child 14.5
sport 14.3
couple 13.9
love 13.4
family 13.3
vacation 13.1
lifestyle 13
portrait 12.9
sky 12.8
ocean 12.4
sun 12.1
dark 11.7
leisure 11.6
boy 11.3
sibling 11.1
world 11
relax 10.9
attractive 10.5
athlete 10.5
crutch 10.4
happiness 10.2
sea 10.2
happy 10
sexy 9.6
sand 9.6
outside 9.4
evening 9.3
two 9.3
old 9.1
dirty 9
black 9
fun 9
cricket equipment 8.8
sports equipment 8.8
together 8.8
run 8.7
cricket bat 8.6
dusk 8.6
walking 8.5
field 8.4
parent 8.4
father 8.3
vintage 8.3
holding 8.3
recreation 8.1
staff 8
sepia 7.8
serenity 7.8
fashion 7.5
relationship 7.5
shore 7.4
action 7.4
park 7.4
peaceful 7.3
lady 7.3
alone 7.3
life 7.3
danger 7.3
body 7.2
bench 7.2
women 7.1
kid 7.1
autumn 7
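
The Imagga tags above match the shape of Imagga's /v2/tags endpoint, which can be called as sketched below; the API key, secret, and image URL are placeholders.

    import requests

    AUTH = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # placeholder credentials

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=AUTH,
    )
    response.raise_for_status()

    # Each tag arrives with an English name and a 0-100 confidence score.
    for entry in response.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')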

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 93.7
clothing 93.5
footwear 91.2
person 90.6
text 79.7
black 72.9
posing 55.6
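
The Microsoft tags above are consistent with the Azure Computer Vision analyze endpoint. The sketch below assumes that service; the resource endpoint, key, and image URL are placeholders.

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                          # placeholder

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.com/photo.jpg"},
    )
    response.raise_for_status()

    # Confidence is reported on a 0-1 scale; scale it to match the list above.
    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')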

Face analysis

Amazon

AWS Rekognition

Age 4-10
Gender Female, 97.3%
Happy 59%
Calm 29.3%
Surprised 4.2%
Sad 3.6%
Angry 2%
Disgusted 0.8%
Confused 0.7%
Fear 0.4%

AWS Rekognition

Age 29-39
Gender Female, 80%
Calm 56.5%
Surprised 23.7%
Sad 10.4%
Happy 5.8%
Fear 1.2%
Disgusted 1.2%
Angry 0.7%
Confused 0.5%

AWS Rekognition

Age 23-33
Gender Female, 99.1%
Happy 64.9%
Sad 10.3%
Calm 9.8%
Angry 5.8%
Fear 4.2%
Surprised 3.3%
Disgusted 1.3%
Confused 0.4%
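
Each AWS Rekognition block above (age range, gender, ranked emotions) is the per-face output of the DetectFaces API. A minimal boto3 sketch follows; the file name is an illustrative assumption.

    import boto3

    client = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("three_legged_race.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions come back unsorted; order by confidence, as the blocks above do.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')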

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
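
The Google Vision blocks above report categorical likelihoods (Very unlikely through Very likely) rather than numeric scores. A hedged google-cloud-vision sketch that yields the same fields follows; the file name is an illustrative assumption.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Hypothetical local copy of the photograph.
    with open("three_legged_race.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each face annotation exposes the likelihood enums listed above.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)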

Feature analysis

Amazon

Person
Person 99.6%
Person 99.6%
Person 99.4%
Person 99.4%
Person 99.3%
Person 99.3%
Person 99.3%
Person 97.9%
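
The per-person confidences above duplicate the Person entries in the Amazon tag list, consistent with Rekognition's per-instance output: "Person" is a single label whose Instances list holds one bounding box and confidence per detected person. A minimal sketch, reusing the assumptions from the earlier boto3 example:

    import boto3

    client = boto3.client("rekognition")

    with open("three_legged_race.jpg", "rb") as f:  # hypothetical local file
        response = client.detect_labels(Image={"Bytes": f.read()})

    for label in response["Labels"]:
        if label["Name"] == "Person":
            for instance in label.get("Instances", []):
                print(f'Person {instance["Confidence"]:.1f}%')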