Human Generated Data

Title

Untitled (Christmas card image, candles with children’s heads superimposed)

Date

1956

People

Artist: Francis J. Sullivan, American, 1916-1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18611

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Icing 98.1
Food 98.1
Dessert 98.1
Cake 98.1
Cream 98.1
Creme 98.1
Nature 97.2
Outdoors 95.8
Person 93.7
Human 93.7
Sand 89.8
Person 84.8
Figurine 84.4
Person 78.6
Person 77.6
Head 73.8
Pottery 62.7
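The Amazon tag list above pairs each label with a confidence score, and includes several near-duplicate entries (four separate "Person" detections at different confidences). A minimal post-processing sketch, using the scores transcribed from this record: filter by a confidence threshold and keep one entry per label at its highest score. The threshold value and the `top_tags` helper are illustrative choices, not part of the record.

```python
# Machine tags transcribed from the record above: (label, confidence) pairs.
amazon_tags = [
    ("Icing", 98.1), ("Food", 98.1), ("Dessert", 98.1), ("Cake", 98.1),
    ("Cream", 98.1), ("Creme", 98.1), ("Nature", 97.2), ("Outdoors", 95.8),
    ("Person", 93.7), ("Human", 93.7), ("Sand", 89.8), ("Person", 84.8),
    ("Figurine", 84.4), ("Person", 78.6), ("Person", 77.6), ("Head", 73.8),
    ("Pottery", 62.7),
]

def top_tags(tags, threshold=80.0):
    """Keep tags at or above `threshold`, one entry per label (highest score)."""
    best = {}
    for label, score in tags:
        if score >= threshold and score > best.get(label, 0.0):
            best[label] = score
    # Sort by descending confidence for display.
    return sorted(best.items(), key=lambda kv: -kv[1])

print(top_tags(amazon_tags))
```

At an 80-point threshold the four "Person" detections collapse to one entry at 93.7, and low-confidence labels such as "Pottery" (62.7) drop out.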

Clarifai
created on 2023-10-22

child 99.5
people 99.1
two 97.2
group 97
three 96.1
art 95
portrait 94.2
son 92.3
monochrome 91.5
illustration 90.8
sibling 89.7
wear 86.6
family 86.1
retro 84.7
four 83.7
boy 83
facial expression 83
fun 82.7
lid 82.6
winter 82.4

Imagga
created on 2022-02-25

kin 42.7
portrait 22
people 17.8
man 17.5
black 17.4
child 17.1
sibling 16.8
body 16.8
face 16.3
person 15.3
male 14.9
adult 13.6
love 12.6
head 12.6
dark 12.5
cemetery 12.2
youth 11.9
happy 11.9
decoration 11.5
fashion 11.3
human 11.2
sexy 11.2
dress 10.8
sport 10.7
costume 10.5
boy 10.4
eyes 10.3
hair 10.3
girls 10
world 9.7
women 9.5
happiness 9.4
culture 9.4
grunge 9.4
model 9.3
makeup 9.1
pretty 9.1
old 9.1
one 9
brother 8.9
mask 8.8
couple 8.7
scary 8.7
art 8.6
traditional 8.3
fun 8.2
religion 8.1
attractive 7.7
outdoor 7.6
two 7.6
relaxation 7.5
joy 7.5
decorative 7.5
vintage 7.4
disguise 7.4
smiling 7.2
color 7.2
celebration 7.2
smile 7.1
grass 7.1
night 7.1
mother 7.1
look 7

Google
created on 2022-02-25

(no tags returned)

Microsoft
created on 2022-02-25

text 95.5
human face 90.8
baby 89.8
toddler 88.9
smile 84.6
person 77
child 71
picture frame 30.4

Color Analysis

Face analysis

AWS Rekognition

Age 1-7
Gender Female, 98.1%
Happy 98.3%
Confused 0.5%
Surprised 0.5%
Fear 0.3%
Sad 0.2%
Disgusted 0.1%
Angry 0.1%
Calm 0.1%

AWS Rekognition

Age 6-14
Gender Female, 50.1%
Happy 99.3%
Confused 0.2%
Calm 0.1%
Disgusted 0.1%
Surprised 0.1%
Sad 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 6-12
Gender Male, 100%
Happy 92.2%
Calm 4.3%
Confused 1.2%
Sad 0.7%
Surprised 0.5%
Angry 0.5%
Disgusted 0.4%
Fear 0.3%

AWS Rekognition

Age 0-4
Gender Female, 99.7%
Happy 97.5%
Calm 1.3%
Confused 0.4%
Disgusted 0.2%
Surprised 0.2%
Sad 0.2%
Fear 0.1%
Angry 0.1%
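Each AWS Rekognition face result above reports an age range, a gender estimate, and per-emotion confidences. A hypothetical roll-up sketch, using only values transcribed from the four blocks above, picks each face's dominant emotion; the `dominant_emotions` helper is illustrative, not part of the Rekognition API.

```python
# Four face estimates transcribed from the AWS Rekognition blocks above.
faces = [
    {"age": (1, 7),  "gender": ("Female", 98.1),
     "emotions": {"Happy": 98.3, "Confused": 0.5, "Surprised": 0.5,
                  "Fear": 0.3, "Sad": 0.2, "Disgusted": 0.1,
                  "Angry": 0.1, "Calm": 0.1}},
    {"age": (6, 14), "gender": ("Female", 50.1),
     "emotions": {"Happy": 99.3, "Confused": 0.2, "Calm": 0.1,
                  "Disgusted": 0.1, "Surprised": 0.1, "Sad": 0.1,
                  "Angry": 0.1, "Fear": 0.0}},
    {"age": (6, 12), "gender": ("Male", 100.0),
     "emotions": {"Happy": 92.2, "Calm": 4.3, "Confused": 1.2,
                  "Sad": 0.7, "Surprised": 0.5, "Angry": 0.5,
                  "Disgusted": 0.4, "Fear": 0.3}},
    {"age": (0, 4),  "gender": ("Female", 99.7),
     "emotions": {"Happy": 97.5, "Calm": 1.3, "Confused": 0.4,
                  "Disgusted": 0.2, "Surprised": 0.2, "Sad": 0.2,
                  "Fear": 0.1, "Angry": 0.1}},
]

def dominant_emotions(faces):
    """Return each face's highest-confidence emotion, in order."""
    return [max(f["emotions"], key=f["emotions"].get) for f in faces]

print(dominant_emotions(faces))
```

All four faces resolve to "Happy". Note that the 50.1% Female estimate on the second face is effectively a coin flip; a threshold-based consumer might treat such a value as "unknown" rather than a definite label.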

Microsoft Cognitive Services

Age 8
Gender Female

Microsoft Cognitive Services

Age 5
Gender Male

Microsoft Cognitive Services

Age 5
Gender Male

Microsoft Cognitive Services

Age 7
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 93.7%
Person 84.8%
Person 78.6%
Person 77.6%

Categories

Imagga

paintings art 92.5%
food drinks 2.6%
pets animals 2%

Text analysis

Amazon

ODAA-