Human Generated Data

Title

Untitled (children of unemployed trappers, Plaquemines Parish, Louisiana)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1347

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Face 100
Head 100
Photography 100
Portrait 100
Person 99.1
Boy 99.1
Child 99.1
Male 99.1
Person 99
Wood 97.2
People 93.8
Happy 93.5
Smile 93.5
Slum 89.4
Outdoors 61.9
Clothing 57.3
Shirt 57.3
Body Part 57.2
Finger 57.2
Hand 57.2
Countryside 56
Nature 56
Hat 55.7
Rural 55.4
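
The Amazon tags above are label/confidence pairs of the kind Amazon Rekognition's label detection returns. The following is a minimal sketch of how such tags could be produced with the boto3 SDK; the image file name and the confidence threshold are illustrative assumptions, not part of the catalog record.

# Minimal sketch: produce name/confidence tags for a local image with
# Amazon Rekognition via boto3. The file path and MinConfidence value
# are illustrative assumptions.
import boto3

def tag_image(path, min_confidence=50):
    client = boto3.client("rekognition")  # uses your configured AWS credentials
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,
    )
    # Each returned label carries a name and a confidence score (0-100).
    return [(label["Name"], round(label["Confidence"], 1))
            for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in tag_image("shahn_p1970_1347.jpg"):  # assumed file name
        print(name, confidence)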

Clarifai
created on 2018-05-11

people 100
child 99.8
group 99.3
portrait 99
adult 98.7
two 97.6
son 97.5
boy 96.7
sibling 95.5
three 95.5
woman 95.5
family 95.2
man 95
offspring 94.9
facial expression 94.3
wear 93.7
home 90.1
sit 88.2
five 87.4
four 86.2

Imagga
created on 2023-10-05

child 100
juvenile 78.3
person 50.9
family 43.6
parent 37.9
kid 37.2
happy 35.7
boy 34.8
mother 32.8
childhood 31.4
children 31
father 29.6
love 28.4
cute 27.3
little 26.5
people 26.2
male 26.2
son 26
kids 25.4
portrait 25.3
happiness 25.1
smiling 24.6
smile 24.2
dad 23.2
together 21.9
baby 21.7
daughter 21.3
park 20.6
playing 20.1
fun 19.5
toddler 19.1
lifestyle 18.8
outdoors 18.7
home 18.4
care 17.3
togetherness 17
joy 16.7
adorable 16.6
face 16.3
play 15.5
sitting 15.5
man 14.8
sister 14.7
youth 14.5
looking 14.4
hug 13.6
brother 12.9
outside 12.8
casual 12.7
two 12.7
black 12.6
infant 12.5
loving 12.4
enjoying 12.3
girls 11.9
adult 11.6
eyes 11.2
old 11.2
blond 10.8
boys 10.7
affectionate 10.7
cheerful 10.6
couple 10.5
life 10.5
innocent 9.8
mom 9.7
affection 9.7
diversity 9.6
school 9.6
friends 9.4
expression 9.4
senior 9.4
garden 9.2
joyful 9.2
holding 9.1
sweet 8.7
innocence 8.7
generation 8.6
playful 8.5
laughing 8.5
enjoy 8.5
house 8.4
hold 8.3
grass 7.9
grandchild 7.9
day 7.8
newborn 7.8
summer 7.7
attractive 7.7
race 7.6
husband 7.6
healthy 7.6
indoor 7.3
group 7.3
colorful 7.2
offspring 7.1
interior 7.1
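
The Imagga tags follow the same tag/confidence pattern. Below is a hedged sketch of a request against Imagga's v2 tagging REST endpoint; the credentials, image URL, and exact response layout are assumptions based on Imagga's public API documentation, not something recorded on this page.

# Sketch (assumed endpoint and response layout): fetch tag/confidence
# pairs from Imagga's v2 /tags endpoint using HTTP Basic auth.
import requests

API_KEY = "your_api_key"        # placeholder credential
API_SECRET = "your_api_secret"  # placeholder credential

def imagga_tags(image_url):
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(API_KEY, API_SECRET),
        timeout=30,
    )
    response.raise_for_status()
    data = response.json()
    # Assumed response shape: result.tags is a list of
    # {"confidence": <float>, "tag": {"en": <name>}} objects.
    return [(t["tag"]["en"], round(t["confidence"], 1))
            for t in data["result"]["tags"]]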

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.6
wooden 70

Face analysis

AWS Rekognition

Age 13-21
Gender Female, 99.9%
Calm 98.5%
Surprised 6.3%
Fear 6%
Sad 2.2%
Angry 0.4%
Happy 0.3%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 6-14
Gender Female, 99.4%
Calm 57.3%
Fear 43.7%
Surprised 6.9%
Sad 3.1%
Angry 0.5%
Confused 0.3%
Happy 0.3%
Disgusted 0.3%
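
The two AWS Rekognition face records above (age range, gender, and a ranked list of emotion confidences) match what Rekognition's face detection returns when all attributes are requested. A minimal boto3 sketch follows; the image path is an assumption.

# Minimal sketch: request full face attributes (age range, gender,
# emotions) from Amazon Rekognition with boto3.
import boto3

def analyze_faces(path):
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include AgeRange, Gender, Emotions, etc.
    )
    results = []
    for face in response["FaceDetails"]:
        age = face["AgeRange"]              # {"Low": ..., "High": ...}
        gender = face["Gender"]             # {"Value": ..., "Confidence": ...}
        emotions = sorted(face["Emotions"],  # list of {"Type", "Confidence"}
                          key=lambda e: e["Confidence"], reverse=True)
        results.append((age, gender, emotions))
    return results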

Microsoft Cognitive Services

Age 6
Gender Female

Microsoft Cognitive Services

Age 5
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
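
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) for surprise, anger, sorrow, joy, headwear, and blur, which is how the Cloud Vision face-detection response is structured. A short sketch with the google-cloud-vision client (v2.x assumed) is below; the file path is an assumption.

# Sketch: read per-face likelihood buckets (surprise, anger, sorrow, joy,
# headwear, blurred) from the Google Cloud Vision face-detection API.
from google.cloud import vision

def face_likelihoods(path):
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    likelihood = vision.Likelihood  # enum: UNKNOWN ... VERY_LIKELY
    report = []
    for face in response.face_annotations:
        report.append({
            "surprise": likelihood(face.surprise_likelihood).name,
            "anger": likelihood(face.anger_likelihood).name,
            "sorrow": likelihood(face.sorrow_likelihood).name,
            "joy": likelihood(face.joy_likelihood).name,
            "headwear": likelihood(face.headwear_likelihood).name,
            "blurred": likelihood(face.blurred_likelihood).name,
        })
    return report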

Feature analysis

Amazon

Person 99.1%
Boy 99.1%
Child 99.1%
Male 99.1%
