Human Generated Data

Title

Ozark Children, Arkansas

Date

1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3067

Machine Generated Data

Tags (each tag is listed with its confidence score, in percent)

Amazon
created on 2021-12-15

Person 99.7
Human 99.7
Person 98.7
Clothing 87.2
Apparel 87.2
People 86.2
Face 77.2
Pants 74.7
Person 66
Portrait 65.2
Photography 65.2
Photo 65.2
Smile 62.5
Kid 60.2
Child 60.2
Long Sleeve 59.5
Sleeve 59.5
Finger 59.3
Wood 56.6
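
The Amazon tags above are the kind of output returned by AWS Rekognition's label-detection call. A minimal sketch of how such a list could be produced, assuming boto3 is installed and AWS credentials are configured (the file name ozark_children.jpg is a placeholder, not the museum's actual asset):

    # Hedged sketch: image labels via AWS Rekognition (boto3).
    # Assumes configured AWS credentials; the file name is a placeholder.
    import boto3

    client = boto3.client("rekognition")

    with open("ozark_children.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,        # roughly the number of tags listed above
        MinConfidence=55,    # the lowest confidence above is about 56.6
    )

    for label in response["Labels"]:
        # Mirror the "tag confidence" layout used in this section.
        print(f'{label["Name"]} {label["Confidence"]:.1f}')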

Clarifai
created on 2023-10-15

child 100
son 99.9
people 99.9
family 99.8
portrait 99.6
baby 99.2
two 99.1
offspring 98.5
three 98.5
sibling 98.1
group 97.6
boy 96.6
four 95.6
man 93.8
retro 93.3
adult 92.6
nostalgia 92.1
sit 92
sepia 91.3
monochrome 91

Imagga
created on 2021-12-15

child 35.4
mother 32.2
people 26.2
parent 25.6
man 25.5
male 24.8
family 24
kin 23.2
happy 22.6
portrait 22
person 21.5
happiness 21.2
couple 20
love 19.7
smiling 18.1
father 17.5
home 16.7
adult 16.3
youth 15.3
senior 15
face 14.9
together 14.9
world 14.8
old 14.6
smile 14.3
lifestyle 13.7
dad 13.5
kid 13.3
sibling 13
husband 12.4
women 11.9
joy 11.7
childhood 11.6
park 11.5
married 11.5
elderly 11.5
boy 11.3
daughter 11.2
aged 10.9
dress 10.8
retired 10.7
human 10.5
outdoors 10.5
blond 10.4
black 10.2
casual 10.2
girls 10
outdoor 9.9
lady 9.7
fun 9.7
sepia 9.7
wife 9.5
relationship 9.4
cute 9.3
two 9.3
clothing 9.3
head 9.2
attractive 9.1
holding 9.1
care 9.1
bench 9
son 8.8
look 8.8
bride 8.6
sitting 8.6
friends 8.5
pretty 8.4
playing 8.2
children 8.2
cheerful 8.1
school 8.1
romantic 8
looking 8
interior 8
little 7.9
baby 7.9
day 7.8
antique 7.8
ancient 7.8
men 7.7
retirement 7.7
enjoying 7.6
females 7.6
laughing 7.6
room 7.6
togetherness 7.6
one 7.5
vintage 7.4
retro 7.4
group 7.3
handsome 7.1
juvenile 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 98.9
clothing 98.8
person 98.6
human face 98.5
baby 97.7
toddler 96.6
child 92.6
window 91.7
boy 86
smile 85.9
old 51.7

Face analysis

AWS Rekognition

Age 5-15
Gender Male, 82.3%
Calm 93.3%
Sad 5.7%
Angry 0.6%
Confused 0.2%
Fear 0.1%
Surprised 0.1%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 20-32
Gender Male, 77.7%
Calm 72.9%
Sad 25.5%
Angry 0.6%
Confused 0.3%
Surprised 0.2%
Fear 0.2%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 0-4
Gender Female, 97.1%
Calm 90.9%
Sad 8%
Surprised 0.5%
Confused 0.3%
Happy 0.1%
Fear 0.1%
Angry 0.1%
Disgusted 0%
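
The three AWS Rekognition records above (estimated age range, gender, and emotion confidences) match the fields returned by Rekognition's face-detection call when all facial attributes are requested. A minimal sketch, again assuming boto3, configured credentials, and a placeholder file name:

    # Hedged sketch: face attributes via AWS Rekognition (boto3).
    import boto3

    client = boto3.client("rekognition")

    with open("ozark_children.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotion estimates.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')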

Microsoft Cognitive Services

Age 26
Gender Female

Microsoft Cognitive Services

Age 7
Gender Female

Microsoft Cognitive Services

Age 2
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
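
The Google Vision entries above report likelihood ratings (e.g. Very unlikely) rather than numeric scores; these correspond to the likelihood fields on each face annotation returned by the Cloud Vision face-detection call. A minimal sketch, assuming the google-cloud-vision client library and application credentials are set up (the file name is again a placeholder):

    # Hedged sketch: face likelihoods via Google Cloud Vision.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("ozark_children.jpg", "rb") as f:  # placeholder file name
        content = f.read()

    response = client.face_detection(image=vision.Image(content=content))

    # Each face annotation carries likelihood enums such as VERY_UNLIKELY or LIKELY.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)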

Feature analysis

Amazon

Person 99.7%

Categories

Imagga

people portraits 68.2%
paintings art 29.5%
pets animals 1.4%