Human Generated Data

Title

Drought Refugees from Oklahoma Camping by the Roadside, Blythe, California, August 1936

Date

1936

People

Artist: Dorothea Lange, American 1895 - 1965

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.718

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 99.6
Person 99.6
Person 99.4
Clothing 85.3
Apparel 85.3
Face 83.6
Wood 79
Urban 74.3
Sitting 73.1
People 71.1
Outdoors 68
Building 65.1
Portrait 63.2
Photography 63.2
Photo 63.2
Countryside 62.3
Shelter 62.3
Nature 62.3
Rural 62.3
Porch 60.8
Person 59.6
Furniture 56.1
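
The Amazon tags above are name/confidence pairs of the kind returned by the Rekognition DetectLabels operation. Purely as an illustration of how such a list can be produced (not the museum's actual pipeline), a minimal boto3 sketch follows; the filename and the confidence cutoff are assumptions.

```python
# Minimal sketch: produce label/confidence pairs like the Amazon list above.
# Assumes AWS credentials are configured and a local copy of the image exists;
# the filename and MinConfidence cutoff are illustrative, not the actual pipeline.
import boto3

rekognition = boto3.client("rekognition")

with open("lange_drought_refugees_blythe_1936.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed threshold; the list above bottoms out in the mid-50s
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```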

Clarifai
created on 2023-10-26

people 100
two 99.5
adult 99.3
portrait 97.8
man 95.7
woman 95.4
furniture 95.4
one 94.6
three 94.5
group 94
war 90.9
wear 90.3
offspring 90
child 89.8
son 88.2
group together 87.9
reclining 87.8
campsite 87.5
soldier 86.1
military 83.2

Imagga
created on 2022-01-22

kin 69.3
mother 41.2
family 40.1
child 34.9
parent 34
happy 33.9
father 27.4
people 26.8
happiness 25.9
together 24.5
man 24.2
adult 23.9
brother 23.5
sibling 23.5
male 23
love 22.9
smiling 22.4
sitting 22.3
portrait 22
couple 21.8
dad 21.1
daughter 21
kid 20.4
smile 20
home 17.6
cute 16.5
husband 16.3
baby 16
lifestyle 15.9
boy 15.7
attractive 15.4
childhood 15.2
wife 15.2
togetherness 15.1
relationship 15
pretty 14.7
relaxing 14.6
looking 14.4
loving 14.3
casual 13.6
relaxed 13.2
youth 12.8
playing 12.8
children 12.8
fun 12.7
joy 12.5
person 12.5
sofa 11.7
affectionate 11.6
couch 11.6
little 11.5
hair 11.1
adorable 11.1
model 10.9
face 10.7
bed 10.6
lady 10.6
resting 10.5
outdoors 10.5
son 10.1
old 9.8
affection 9.7
women 9.5
senior 9.4
rest 9.3
fashion 9.1
cheerful 8.9
caring 8.8
sexy 8.8
hug 8.7
elderly 8.6
relax 8.4
grandma 8.4
relaxation 8.4
mature 8.4
hand 8.4
care 8.2
girls 8.2
handsome 8
romantic 8
indoors 7.9
brunette 7.8
living room 7.8
play 7.8
summer 7.7
two 7.6
park 7.4
toy 7.4
room 7.3
group 7.3
gorgeous 7.3
aged 7.2
life 7.2
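
The Imagga tags follow the same tag-plus-confidence shape and can be reproduced, under assumptions, with a plain REST call to Imagga's /v2/tags endpoint; the API key, secret, and image URL below are placeholders, not values from this record.

```python
# Minimal sketch: fetch tag/confidence pairs like the Imagga list above from the
# public /v2/tags endpoint. Credentials and the image URL are placeholders.
import requests

IMAGGA_KEY = "<api_key>"          # placeholder credential
IMAGGA_SECRET = "<api_secret>"    # placeholder credential
IMAGE_URL = "https://example.org/lange_drought_refugees.jpg"  # hypothetical URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```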

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 98.1
clothing 92
outdoor 88.7
text 88.1
black and white 87.3
human face 78.9
drawing 62.7
old 56.6
woman 55.4
man 52.9
sofa 36.3
picture frame 6.7

Color Analysis

Face analysis

AWS Rekognition

Age 43-51
Gender Male, 99.7%
Calm 59.4%
Confused 31.5%
Angry 2.5%
Disgusted 2.1%
Happy 1.8%
Sad 1.4%
Fear 0.7%
Surprised 0.7%

AWS Rekognition

Age 31-41
Gender Female, 100%
Sad 72.7%
Confused 19.9%
Calm 2.8%
Fear 1.7%
Disgusted 1%
Angry 1%
Surprised 0.6%
Happy 0.3%
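
The two AWS Rekognition records above (age range, gender, and ranked emotion scores) correspond to the output of the DetectFaces operation when all facial attributes are requested. A minimal boto3 sketch, with a hypothetical filename, is shown below as an illustration rather than the actual workflow.

```python
# Minimal sketch: per-face age range, gender, and emotion scores similar to the
# AWS Rekognition blocks above. Attributes=["ALL"] is required for Rekognition
# to return age, gender, and emotion estimates; the filename is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("lange_drought_refugees_blythe_1936.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```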

Microsoft Cognitive Services

Age 50
Gender Male

Microsoft Cognitive Services

Age 29
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
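
The Google Vision records report likelihood buckets rather than percentages. A minimal sketch with the google-cloud-vision client, assuming a local copy of the image under a hypothetical filename, could look like the following.

```python
# Minimal sketch: per-face likelihood buckets (surprise, anger, sorrow, joy,
# headwear, blur) like the Google Vision blocks above. Filename is a placeholder.
from google.cloud import vision

# Likelihood enum values 0-5 map to these names in the Vision API.
LIKELIHOOD = ["Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]

client = vision.ImageAnnotatorClient()

with open("lange_drought_refugees_blythe_1936.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", LIKELIHOOD[face.surprise_likelihood])
    print("Anger", LIKELIHOOD[face.anger_likelihood])
    print("Sorrow", LIKELIHOOD[face.sorrow_likelihood])
    print("Joy", LIKELIHOOD[face.joy_likelihood])
    print("Headwear", LIKELIHOOD[face.headwear_likelihood])
    print("Blurred", LIKELIHOOD[face.blurred_likelihood])
```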

Feature analysis

Amazon

Person 99.6%

Categories

Imagga

paintings art 96.1%
people portraits 3.3%

Captions

Microsoft
created on 2022-01-22

an old photo of a man 86.5%
a man sitting on a bench 64.7%
an old photo of a boy 64%
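
The Microsoft caption candidates with confidence scores match the shape returned by the Azure Computer Vision describe-image operation. A minimal sketch follows, assuming placeholder endpoint, key, and filename; it is an illustration of the API shape, not the pipeline that generated the values above.

```python
# Minimal sketch: caption candidates with confidences like the Microsoft list
# above, via the Azure Computer Vision "describe image" operation. Endpoint,
# key, and filename are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource-name>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<subscription-key>"),      # placeholder key
)

with open("lange_drought_refugees_blythe_1936.jpg", "rb") as f:  # hypothetical filename
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```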