Human Generated Data

Title

Untitled (two girls and baby outside)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17012

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.5
Human 99.5
Person 99.4
Shorts 99
Clothing 99
Apparel 99
Vegetation 98.5
Plant 98.5
Play 98.1
Tree 97.5
Grass 95.3
Person 94.5
Helmet 92.7
Outdoors 90.6
Woodland 90.3
Forest 90.3
Nature 90.3
Land 90.3
Face 80.2
Bush 79.4
Kid 78.1
Child 78.1
People 75.9
Yard 72.8
Grove 68.7
Female 65.5
Ground 63.9
Baby 62.1
Girl 60.8
Portrait 60.1
Photography 60.1
Photo 60.1
Tree Trunk 58.8
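The Amazon tags above pair each label with a confidence score on a 0–100 scale. A minimal sketch, not the museum's actual pipeline, of filtering such label/score pairs at a typical confidence threshold (the `tags` list below reproduces a subset of the values shown):

```python
# Subset of the Amazon-generated labels listed above, as
# (label, confidence) pairs on a 0-100 scale.
tags = [
    ("Person", 99.5), ("Shorts", 99.0), ("Vegetation", 98.5),
    ("Helmet", 92.7), ("Baby", 62.1), ("Girl", 60.8),
]

def filter_tags(tags, threshold=90.0):
    """Keep only the labels at or above the confidence threshold."""
    return [label for label, score in tags if score >= threshold]

print(filter_tags(tags))  # high-confidence labels only
```

Lower-confidence labels such as "Baby" (62.1) and "Girl" (60.8) fall below a 90-point cutoff, which is why such tags are usually treated as suggestions rather than facts.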

Clarifai
created on 2023-10-28

people 99.7
child 99.4
monochrome 99.1
boy 97.3
girl 97
portrait 94.7
woman 94.4
group 92.9
recreation 91.1
adult 90.9
group together 90.5
man 88.2
street 88
fun 87
walk 86.3
family 86.3
black and white 85.6
son 85.5
sepia 82.9
sibling 81

Imagga
created on 2022-02-26

child 36
people 22.3
world 19.7
man 18.1
portrait 18.1
outdoor 17.6
kin 17.3
statue 16.8
person 16.6
outdoors 15.7
sculpture 15.1
adult 14.2
black 13.8
face 13.5
summer 12.9
happy 12.5
male 12.1
love 11.8
happiness 11.8
sport 11.6
park 11.5
culture 11.1
youth 11.1
travel 10.6
art 10.5
old 10.5
outside 10.3
sky 10.2
joy 10
silhouette 9.9
religion 9.9
pretty 9.8
fun 9.7
boy 9.6
lifestyle 9.4
two 9.3
smile 9.3
leisure 9.1
attractive 9.1
fashion 9
sunset 9
family 8.9
bride 8.8
sepia 8.7
hair 8.7
smiling 8.7
traditional 8.3
vacation 8.2
dress 8.1
kid 8
autumn 7.9
grass 7.9
cute 7.9
stone 7.8
parent 7.8
married 7.7
decoration 7.7
head 7.6
field 7.5
dark 7.5
monument 7.5
vintage 7.4
juvenile 7.4
tourism 7.4
player 7.4
girls 7.3
children 7.3
active 7.2
holiday 7.2

Google
created on 2022-02-26

Plant 90.7
Black 89.5
Smile 86.9
Black-and-white 85.2
Happy 80.3
People in nature 79.4
Adaptation 79.4
Grass 78.8
Monochrome 77.8
Monochrome photography 77.5
Tree 76.8
Vintage clothing 69.4
Photo caption 68.9
Recreation 64.9
Child 64.8
Sitting 61.7
Suit 59.3
Room 57.1
Laugh 57
Running 54.9

Microsoft
created on 2022-02-26

outdoor 99.9
person 99
black and white 74.1
text 72.3
flower 63.1
rooster 18.2

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 93.7%
Calm 59.4%
Happy 34.4%
Angry 1.4%
Sad 1.3%
Surprised 1.3%
Disgusted 1.1%
Confused 0.7%
Fear 0.3%

AWS Rekognition

Age 6-12
Gender Male, 96.4%
Sad 95.7%
Fear 1.5%
Angry 1%
Calm 1%
Disgusted 0.3%
Happy 0.3%
Confused 0.2%
Surprised 0.1%
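Each Rekognition face result above ranks eight emotions by confidence, summing to roughly 100%. A minimal sketch, using the scores reported for the second face, of picking the dominant emotion from such a result:

```python
# Emotion scores (percent) mirroring the second AWS Rekognition face above.
emotions = {
    "Sad": 95.7, "Fear": 1.5, "Angry": 1.0, "Calm": 1.0,
    "Disgusted": 0.3, "Happy": 0.3, "Confused": 0.2, "Surprised": 0.1,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # prints "Sad"
```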

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Person 99.4%
Person 94.5%
Helmet 92.7%