Human Generated Data

Title

Untitled (children playing games at party, outside)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17612

Machine Generated Data

Tags (confidence scores in %)

Amazon
created on 2022-02-26

Clothing 99.5
Apparel 99.5
Person 99.4
Human 99.4
Person 98.9
Dress 98.4
Person 98.4
Person 97.4
Person 95.2
Female 89.5
Plant 85.2
Face 82.5
Shorts 81.3
Kid 80.7
Child 80.7
Play 80.3
Grass 78.4
Tree 78.4
Outdoors 77.7
Girl 76.6
People 72.7
Shoe 71.7
Footwear 71.7
Yard 68.1
Nature 68.1
Portrait 67.5
Photography 67.5
Photo 67.5
Skirt 65.3
Woman 63.1
Chair 59.9
Furniture 59.9
Pants 56
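
The label list above matches the shape of AWS Rekognition's DetectLabels output. A minimal sketch of how such a list could be regenerated with boto3 follows; the file path "photo.jpg" and the MaxLabels/MinConfidence settings are illustrative assumptions, not values recorded here.

```python
# Sketch: label detection with AWS Rekognition (boto3).
# Assumes AWS credentials are configured; "photo.jpg" is a hypothetical path.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,        # illustrative cap on returned labels
        MinConfidence=50.0,  # illustrative confidence floor
    )

# Emit "Label confidence" pairs in the same shape as the record above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```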

Clarifai
created on 2023-10-29

people 100
child 99.9
group 98
group together 97.7
family 94.6
three 93.6
two 93.1
wear 92.1
adult 92
four 90.8
recreation 90.7
home 90.5
boy 90.3
several 88.8
interaction 87.7
woman 86.4
offspring 86.1
man 85.8
sibling 84.5
son 83.4
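
Clarifai's general recognition model returns concept/confidence pairs like those above. A hedged sketch against Clarifai's v2 REST endpoint follows; the API key placeholder, the model id "general-image-recognition", and the file path are assumptions.

```python
# Sketch: concept tagging via Clarifai's v2 REST API.
# CLARIFAI_API_KEY and "photo.jpg" are hypothetical placeholders.
import base64
import requests

with open("photo.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("ascii")

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key CLARIFAI_API_KEY"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
resp.raise_for_status()

# Concept values are 0-1; scale to percent to match the record above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```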

Imagga
created on 2022-02-26

rake 31
tool 28.3
tent 21.6
mountain tent 20.9
man 20.8
person 18.6
sky 17.9
landscape 17.8
people 16.7
outdoors 16.5
shelter 16.4
beach 15.6
musical instrument 15.5
sand 15.4
water 15.3
outdoor 14.5
holiday 14.3
bench 14.3
male 14.2
adult 13.6
travel 13.4
canvas tent 13.3
summer 12.9
lake 12.8
accordion 12.8
structure 12.8
danger 12.7
sea 12.5
outside 12
ocean 11.6
tree 11.5
park 11.5
vacation 11.5
sun 11.3
protection 10.9
forest 10.4
keyboard instrument 10.3
dirty 9.9
love 9.5
sitting 9.4
sunset 9
destruction 8.8
wind instrument 8.6
adventure 8.5
shovel 8.5
park bench 8.2
religion 8.1
building 8
parasol 8
business 7.9
couple 7.8
accident 7.8
toxic 7.8
protective 7.8
scene 7.8
lonely 7.7
mask 7.7
relax 7.6
walking 7.6
clothing 7.5
field 7.5
snow 7.5
dark 7.5
tourism 7.4
chair 7.4
peaceful 7.3
freedom 7.3
alone 7.3
industrial 7.3
child 7.3
lifestyle 7.2
seat 7.2
grass 7.1
trees 7.1
day 7.1
scenic 7
season 7
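
Imagga serves tag/confidence pairs of this kind from its /v2/tags endpoint, authenticated with HTTP basic auth. A sketch follows; the key/secret placeholders and the image URL are assumptions.

```python
# Sketch: tagging via Imagga's /v2/tags REST endpoint.
# IMAGGA_KEY / IMAGGA_SECRET and the image URL are hypothetical placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("IMAGGA_KEY", "IMAGGA_SECRET"),
)
resp.raise_for_status()

# Imagga confidences are already on a 0-100 scale.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```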

Google
created on 2022-02-26

(no labels returned)

Microsoft
created on 2022-02-26

outdoor 93.1
text 92.2
person 88.9
clothing 82.5
dress 73
footwear 53.7
old 52.2
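
Tag lists like this one match the Tags feature of Azure's Computer Vision Analyze Image operation. A sketch against the v3.2 REST endpoint follows; the resource endpoint, subscription key, and image URL are placeholders, and the API version is an assumption.

```python
# Sketch: image tagging via Azure Computer Vision's v3.2 Analyze endpoint.
# The endpoint host, AZURE_KEY, and image URL are hypothetical placeholders.
import requests

endpoint = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "AZURE_KEY"},
    json={"url": "https://example.org/photo.jpg"},
)
resp.raise_for_status()

# Azure confidences are 0-1; scale to percent to match the record above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```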

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 87.9%
Calm 49.1%
Sad 31.7%
Happy 13%
Disgusted 1.6%
Angry 1.5%
Surprised 1.1%
Confused 1.1%
Fear 0.9%

AWS Rekognition

Age 26-36
Gender Male, 85.6%
Calm 94.3%
Happy 4.2%
Sad 0.8%
Confused 0.2%
Fear 0.2%
Surprised 0.2%
Disgusted 0.1%
Angry 0.1%
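
The per-face age range, gender, and emotion breakdowns above follow the shape of AWS Rekognition's DetectFaces output with all attributes enabled. A boto3 sketch follows; the file path is an assumption.

```python
# Sketch: per-face age/gender/emotion estimates from AWS Rekognition.
# "photo.jpg" is a hypothetical path; Attributes=["ALL"] enables emotions.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort descending to match the record above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```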

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
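
Google Vision reports each face attribute as a likelihood bucket ("Very unlikely" through "Very likely") rather than a percentage. A sketch with the google-cloud-vision Python client follows; the file path is an assumption and application credentials are presumed configured.

```python
# Sketch: face likelihood buckets from the Google Cloud Vision API.
# "photo.jpg" is a hypothetical path.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Likelihood enums correspond to the "Very unlikely" .. "Very likely" buckets.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```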

Feature analysis

Amazon

Person 99.4%
Person 98.9%
Person 98.4%
Person 97.4%
Person 95.2%
Shoe 71.7%

Captions

Microsoft
created on 2022-02-26

a vintage photo of a girl 72%
a vintage photo of a person 71.9%
a vintage photo of a boy 52.5%
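
Ranked captions of this kind match Azure Computer Vision's Describe Image operation, which returns several caption candidates with confidences. A sketch against the v3.2 REST endpoint follows; the endpoint host, key, image URL, and maxCandidates value are placeholders or assumptions.

```python
# Sketch: ranked captions via Azure Computer Vision's v3.2 Describe endpoint.
# The endpoint host, AZURE_KEY, and image URL are hypothetical placeholders.
import requests

endpoint = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/describe",
    params={"maxCandidates": "3"},  # ask for several candidate captions
    headers={"Ocp-Apim-Subscription-Key": "AZURE_KEY"},
    json={"url": "https://example.org/photo.jpg"},
)
resp.raise_for_status()

# Caption confidences are 0-1; scale to percent to match the record above.
for caption in resp.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```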