Human Generated Data

Title

Wife and children of sharecropper

Date

1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3072

Machine Generated Data

Tags (confidence scores, 0-100)

Amazon
created on 2021-12-15

Person 98.7
Human 98.7
Person 96.8
People 96.6
Person 92.6
Female 92.3
Family 89.1
Face 87.7
Dress 87.4
Clothing 87.4
Apparel 87.4
Teen 83.3
Woman 77.3
Girl 76.1
Kid 74.6
Child 74.6
Photography 73.7
Photo 73.7
Portrait 72.3
Building 69.1
Urban 66.3
Baby 65.9
Housing 62.5
Smile 58.8
Home Decor 58.4
Man 58.3
Outdoors 55.9
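
These labels match the shape of Amazon Rekognition's DetectLabels output. A minimal boto3 sketch of reproducing such a list follows; the file path and the MinConfidence threshold (chosen near the lowest score above) are assumptions, not part of the source record.

```python
import boto3

# Hypothetical local copy of the photograph; the record gives no file path.
IMAGE_PATH = "shahn_sharecropper.jpg"

client = boto3.client("rekognition", region_name="us-east-1")
with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# MinConfidence=55 is an assumption picked to admit the lowest score
# listed above (Outdoors 55.9).
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```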

Clarifai
created on 2023-10-15

people 100
child 99.9
portrait 99.9
two 99.6
baby 99.5
offspring 99.5
family 99.3
son 99.2
three 99.2
adult 98.9
group 97.9
monochrome 97.8
sibling 97.4
affection 96.6
sepia 96.6
facial expression 96.2
interaction 95.4
four 95.2
wear 94.1
retro 94
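
Clarifai scores concepts on a 0-1 scale, so the values above look like those scores multiplied by 100. A minimal sketch against Clarifai's v2 REST API, assuming the public general-image-recognition model and placeholder credentials and image URL:

```python
import requests

HEADERS = {"Authorization": "Key YOUR_CLARIFAI_PAT"}  # placeholder token
PAYLOAD = {"inputs": [{"data": {"image": {
    "url": "https://example.org/shahn_sharecropper.jpg"  # placeholder URL
}}}]}

resp = requests.post(
    "https://api.clarifai.com/v2/users/clarifai/apps/main/"
    "models/general-image-recognition/outputs",
    headers=HEADERS, json=PAYLOAD, timeout=30,
)
resp.raise_for_status()
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Scale the 0-1 concept value to match the 0-100 figures above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```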

Imagga
created on 2021-12-15

kin 82.5
mother 39.2
parent 30.1
child 28.5
portrait 26.5
people 21.2
man 20.2
male 18.6
old 17.4
statue 17.2
sibling 16
face 15.6
adult 15.5
family 15.1
love 15
happiness 14.1
ancient 13.8
happy 13.8
daughter 13.6
sculpture 13.4
vintage 13.2
world 13
culture 12.8
hair 12.7
black 12.6
father 12.1
human 12
marble 11.6
lifestyle 11.6
couple 11.3
person 11.1
head 10.9
sepia 10.7
one 10.5
antique 10.4
youth 10.2
stone 10.1
girls 10
religion 9.9
outdoors 9.7
smiling 9.4
senior 9.4
cute 9.3
close 9.1
hand 9.1
art 9.1
aged 9.1
dad 9
closeup 8.8
women 8.7
model 8.6
money 8.5
grandma 8.5
religious 8.4
outdoor 8.4
retro 8.2
home 8
together 7.9
day 7.8
smile 7.8
eyes 7.7
sad 7.7
attractive 7.7
bride 7.7
married 7.7
two 7.6
park 7.4
historic 7.3
sexy 7.2
groom 7.2
currency 7.2
architecture 7
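
Imagga's tagging endpoint reports confidences on a 0-100 scale, matching the list above. A minimal sketch, assuming placeholder API credentials and a placeholder image URL:

```python
import requests

# Imagga uses HTTP basic auth with an API key/secret pair (placeholders here).
AUTH = ("YOUR_API_KEY", "YOUR_API_SECRET")

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/shahn_sharecropper.jpg"},
    auth=AUTH, timeout=30,
)
resp.raise_for_status()
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```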

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

baby 99.2
person 98.9
human face 98.5
toddler 97.7
text 97.4
clothing 96.4
child 92.6
smile 87.9
boy 63.9
picture frame 24.5
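
These tags match the shape of Azure Computer Vision's tag operation, which scores tags 0-1. A minimal sketch against the v3.2 REST API, assuming a placeholder resource endpoint, key, and image URL:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
HEADERS = {
    "Ocp-Apim-Subscription-Key": "YOUR_KEY",  # placeholder
    "Content-Type": "application/json",
}

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers=HEADERS,
    json={"url": "https://example.org/shahn_sharecropper.jpg"},  # placeholder
    timeout=30,
)
resp.raise_for_status()
for tag in resp.json()["tags"]:
    # Scale the 0-1 confidence to match the 0-100 figures above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```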

Face analysis

AWS Rekognition

Age 7-17
Gender Female, 69.3%
Calm 97.9%
Sad 1.4%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
Happy 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 32-48
Gender Female, 80.8%
Sad 57.8%
Calm 33.7%
Fear 2.8%
Confused 2.6%
Surprised 1.2%
Angry 1.1%
Disgusted 0.5%
Happy 0.3%

AWS Rekognition

Age 18-30
Gender Male, 84.2%
Calm 93.7%
Sad 5.6%
Surprised 0.2%
Angry 0.2%
Confused 0.1%
Happy 0.1%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 0-3
Gender Female, 94.2%
Calm 87.9%
Sad 8.8%
Fear 1.4%
Angry 1%
Happy 0.3%
Surprised 0.3%
Confused 0.2%
Disgusted 0.1%
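
The four blocks above match Rekognition's DetectFaces output with all facial attributes requested: one entry per detected face, each with an age range, a gender guess with confidence, and a full emotion distribution. A minimal boto3 sketch (file path assumed):

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("shahn_sharecropper.jpg", "rb") as f:  # hypothetical file path
    image_bytes = f.read()

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to the default
# bounding-box response.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```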

Microsoft Cognitive Services

Age 27
Gender Female

Microsoft Cognitive Services

Age 7
Gender Female

Microsoft Cognitive Services

Age 1
Gender Female
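
Point estimates of age and gender like these came from the Face API's v1.0 detect operation. Note that Microsoft retired the age and gender attributes in 2022, so this historical sketch (placeholder endpoint, key, and image URL) will no longer run as written:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},  # attributes retired in 2022
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY",  # placeholder
             "Content-Type": "application/json"},
    json={"url": "https://example.org/shahn_sharecropper.jpg"},  # placeholder
    timeout=30,
)
resp.raise_for_status()
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")
```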

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
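
Google Vision reports face attributes as coarse likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, one block per detected face. A minimal sketch with the google-cloud-vision client (file path assumed):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("shahn_sharecropper.jpg", "rb") as f:  # hypothetical file path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum value, e.g. VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```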

Feature analysis

Amazon

Person 98.7%

Categories

Imagga

people portraits 66.5%
paintings art 32.4%
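
"people portraits" and "paintings art" are category names from Imagga's public personal_photos categorizer, which scores an image against a fixed set of scene categories. A minimal sketch, assuming placeholder credentials and image URL:

```python
import requests

AUTH = ("YOUR_API_KEY", "YOUR_API_SECRET")  # placeholder credentials

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.org/shahn_sharecropper.jpg"},
    auth=AUTH, timeout=30,
)
resp.raise_for_status()
for category in resp.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}%")
```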