Human Generated Data

Title

Family Squatting on FSA property, Caruthersville, Missouri

Date

1938

People

Artist: Russell Lee, American 1903 - 1986

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2975

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.4
Human 99.4
Person 99.1
Person 98.5
Person 98.4
People 93.6
Chair 93.6
Furniture 93.6
Clothing 93.5
Apparel 93.5
Pants 73.1
Family 67
Military Uniform 61.4
Military 61.4

Clarifai
created on 2023-10-26

people 99.9
child 98.9
portrait 98.3
group 98.3
adult 97.7
three 97.3
man 96.1
family 95.9
son 95
four 94.7
group together 94.7
two 94.7
offspring 94.4
woman 93.7
wear 91.6
military 90.5
war 88.4
retro 87.9
facial expression 87.7
administration 87.1

Imagga
created on 2022-01-22

kin 81.3
statue 40.1
sculpture 33.5
architecture 26.6
history 25.9
ancient 22.5
monument 22.4
religion 22.4
old 21.6
culture 20.5
stone 20.2
art 20.2
musical instrument 19.8
tourism 19.8
historical 18.8
accordion 17.7
travel 16.9
historic 16.5
religious 15.9
building 15.9
city 15
keyboard instrument 14.2
marble 13.6
god 13.4
wind instrument 13.2
famous 13
antique 13
world 13
temple 12.7
traditional 12.5
mother 11.9
landmark 11.7
man 11.4
detail 11.3
catholic 10.7
decoration 10.1
face 9.9
heritage 9.7
spirituality 9.6
church 9.3
vintage 9.1
soldier 8.8
saint 8.7
male 8.5
parent 8.4
people 8.4
tourist 8.2
figure 8.1
statues 7.9
carved 7.8
portrait 7.8
spiritual 7.7
grandfather 7.4
peace 7.3
dress 7.2
column 7.1
love 7.1

Google
created on 2022-01-22

Chair 85.6
Dress 85.1
Hat 78
Classic 76.2
Vintage clothing 74.1
Sitting 68.5
One-piece garment 65.7
Room 65.2
Stock photography 63.6
Monochrome 62.7
Event 62.5
History 62.2
Fun 61.2
Retro style 57
Family 54.7
Child 53.1
Art 52.7
Curtain 50.5

Microsoft
created on 2022-01-22

person 99.3
clothing 98.6
human face 96.6
baby 95.3
toddler 94.4
text 92.9
smile 92.7
outdoor 91.6
standing 86.8
old 85.1
child 84.8
posing 69.7
white 69.2
boy 66.1
family 57.5
clothes 20.3

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 99.8%
Calm 82.1%
Angry 9%
Confused 5.8%
Sad 1.1%
Happy 0.7%
Surprised 0.7%
Disgusted 0.3%
Fear 0.3%

AWS Rekognition

Age 38-46
Gender Male, 99.7%
Angry 50.6%
Calm 41.2%
Confused 3.3%
Sad 1.6%
Disgusted 1.2%
Surprised 1.1%
Happy 0.7%
Fear 0.4%

AWS Rekognition

Age 2-8
Gender Female, 98%
Sad 69%
Fear 16.6%
Angry 13.4%
Calm 0.3%
Disgusted 0.2%
Confused 0.2%
Happy 0.2%
Surprised 0.1%

AWS Rekognition

Age 6-16
Gender Male, 99.8%
Calm 71.2%
Angry 17.6%
Confused 7.6%
Sad 1.9%
Happy 0.8%
Surprised 0.4%
Fear 0.3%
Disgusted 0.2%

Microsoft Cognitive Services

Age 42
Gender Male

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 4
Gender Female

Microsoft Cognitive Services

Age 5
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

paintings art 93.4%
people portraits 6.3%