Human Generated Data

Title

Leddie and Children

Date

1990

People

Artist: Shelby Lee Adams, American, born 1950

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Purchased from Gahan Fund, 2.2002.28

Copyright

© Shelby Lee Adams Archive

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.7
Human 99.7
Person 99.7
Person 98.8
Person 98.7
Person 98.7
Person 98.6
Person 98.5
Person 97.9
Person 97.8
People 95.2
Person 85.4
Family 84.4
Urban 78.5
Building 64.2
Porch 62.3
Clothing 59.6
Apparel 59.6
Shorts 59.4
Face 58.9

Clarifai
created on 2023-10-25

people 99.9
child 99.9
group 99.7
group together 98.9
monochrome 98.6
son 98.2
family 97.7
adult 95.6
sibling 95.4
offspring 95.1
woman 93.7
three 93.3
facial expression 92.5
baby 92
man 91.7
portrait 91.6
four 91.6
many 91.3
several 90.9
boy 90.4

Imagga
created on 2022-01-08

totem pole 42.8
column 36.1
structure 26.4
statue 21.6
sculpture 19.7
pillory 19.4
instrument 18.2
man 17.5
instrument of punishment 16.6
ancient 16.4
religion 16.1
old 16
child 15.3
art 15.2
architecture 14.8
device 14.7
people 14.5
male 13.6
temple 13.6
person 13.5
human 12.7
one 12.7
god 12.4
adult 12.3
world 12.2
culture 11.1
antique 10.4
city 10
vintage 9.9
travel 9.9
history 9.8
religious 9.4
stone 9.3
relaxation 9.2
black 9
outdoors 9
building 8.8
women 8.7
lifestyle 8.7
carving 8.6
girls 8.2
guillotine 8.1
park 8.1
body 8
hair 7.9
face 7.8
portrait 7.8
instrument of execution 7.7
window 7.6
famous 7.4
water 7.3
peace 7.3
sensuality 7.3
dirty 7.2
family 7.1
love 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 98.4
posing 96.6
child 96.1
outdoor 95.9
clothing 95.8
toddler 95.6
text 95.3
human face 95.2
baby 94.2
boy 85.2
people 79.2
group 78.3
smile 76.3
girl 71.3
old 46.5
family 27.4
picture frame 18.1
crowd 0.9

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 14-22
Gender Male, 95.7%
Sad 45.7%
Fear 29.9%
Calm 22.4%
Surprised 0.7%
Angry 0.4%
Disgusted 0.4%
Confused 0.3%
Happy 0.1%

AWS Rekognition

Age 7-17
Gender Male, 100%
Calm 98.7%
Sad 0.8%
Angry 0.2%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%
Confused 0%
Happy 0%

AWS Rekognition

Age 9-17
Gender Female, 100%
Sad 78.5%
Calm 17.6%
Angry 1.8%
Confused 1.4%
Fear 0.3%
Disgusted 0.2%
Surprised 0.2%
Happy 0.1%

AWS Rekognition

Age 47-53
Gender Female, 100%
Calm 68.5%
Happy 19.5%
Surprised 5.2%
Disgusted 1.9%
Fear 1.6%
Confused 1.5%
Angry 1.1%
Sad 0.6%

AWS Rekognition

Age 6-14
Gender Male, 98.9%
Happy 99.5%
Confused 0.2%
Surprised 0.1%
Angry 0%
Fear 0%
Calm 0%
Sad 0%
Disgusted 0%

AWS Rekognition

Age 4-10
Gender Male, 86.4%
Calm 73.5%
Sad 19.8%
Angry 5.3%
Confused 0.3%
Surprised 0.3%
Happy 0.3%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 6-14
Gender Female, 54.3%
Angry 99.7%
Sad 0.2%
Confused 0.1%
Calm 0%
Disgusted 0%
Fear 0%
Surprised 0%
Happy 0%

AWS Rekognition

Age 6-14
Gender Female, 100%
Calm 74%
Confused 7.7%
Happy 5%
Surprised 4.7%
Angry 4.4%
Sad 2.1%
Fear 1.2%
Disgusted 0.9%

AWS Rekognition

Age 0-3
Gender Female, 93.9%
Calm 63.7%
Sad 34.5%
Happy 1.1%
Angry 0.4%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 14-22
Gender Male, 88.3%
Calm 40.5%
Angry 34.9%
Sad 21.4%
Confused 1.1%
Fear 0.9%
Disgusted 0.4%
Surprised 0.4%
Happy 0.3%

Microsoft Cognitive Services

Age 9
Gender Female

Microsoft Cognitive Services

Age 7
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 55
Gender Female

Microsoft Cognitive Services

Age 5
Gender Female

Microsoft Cognitive Services

Age 10
Gender Male

Microsoft Cognitive Services

Age 19
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Categories

Imagga

paintings art 87.1%
people portraits 11.9%