Human Generated Data

Title

Untitled (child seated on table, draped background)

Date

c. 1920

People

Artist: Michael Disfarmer, American 1884 - 1959

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Neal and Susan Yanofsky, 2006.271

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 97.8
Human 97.8
Face 94.7
Smile 81.1
Portrait 72.2
Photography 72.2
Photo 72.2
Clothing 71.8
Apparel 71.8
Indoors 70.8
Art 69.1
Room 64.1
People 60.7
Baby 60.4
Painting 58.8
Sitting 56.8
Kid 56.7
Child 56.7
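
These Amazon labels have the shape of an AWS Rekognition DetectLabels response. A minimal Python sketch with boto3, assuming a local copy of the image and configured AWS credentials; the file name, region, and confidence threshold are illustrative, not taken from the record:

    import boto3

    # Rekognition label detection; region and file name are placeholders.
    client = boto3.client("rekognition", region_name="us-east-1")

    with open("2006.271.jpg", "rb") as f:  # hypothetical local copy of the photograph
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # roughly the lowest score listed above
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')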

Clarifai
created on 2023-10-29

portrait 99.4
people 99.1
child 98.7
art 97.4
retro 97.3
one 97.2
vintage 96.9
sepia 96.8
wear 96.8
painting 93.4
antique 93.1
paper 92.7
old 91.7
girl 91.2
sepia pigment 88.7
nostalgia 88.2
son 85.6
baby 85.5
dirty 84.7
adult 80.9
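
The Clarifai concepts resemble output from Clarifai's general image-recognition model. A hedged sketch against the v2 predict REST endpoint; the access token, model ID, and image URL are placeholders, and the exact payload fields can vary with account setup:

    import requests

    # Clarifai v2 predict call; token, model ID, and image URL are placeholders.
    url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
    headers = {"Authorization": "Key YOUR_PAT", "Content-Type": "application/json"}
    payload = {
        "inputs": [{"data": {"image": {"url": "https://example.org/2006.271.jpg"}}}]
    }

    response = requests.post(url, headers=headers, json=payload).json()
    for concept in response["outputs"][0]["data"]["concepts"]:
        print(concept["name"], round(concept["value"] * 100, 1))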

Imagga
created on 2022-03-05

child 63
world 38.1
person 21.1
portrait 20.1
sexy 19.3
face 18.5
juvenile 18.1
male 17.8
body 17.6
man 17.5
human 15
sculpture 14.8
model 14.8
black 13.8
adult 13.6
people 12.8
hair 12.7
fitness 12.7
attractive 12.6
statue 12.4
lifestyle 10.8
dress 10.8
muscular 10.5
historical 10.4
head 10.1
sport 9.9
art 9.9
mother 9.8
pretty 9.8
lady 9.7
bride 9.6
love 9.5
wall 9.4
youth 9.4
strength 9.4
athlete 9.2
healthy 8.8
smile 8.6
culture 8.5
skin 8.5
old 8.4
training 8.3
fashion 8.3
religion 8.1
posing 8
boy 7.8
eyes 7.7
grunge 7.7
health 7.6
stone 7.6
arm 7.6
strong 7.5
life 7.5
one 7.5
slim 7.4
wedding 7.4
sensuality 7.3
exercise 7.3
gorgeous 7.3
blackboard 7.2
parent 7
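
The Imagga tag-and-confidence pairs match the response format of Imagga's tagging endpoint. A Python sketch against the v2 REST API, assuming placeholder API credentials and a publicly reachable image URL:

    import requests

    # Imagga /v2/tags call; credentials and image URL are placeholders.
    auth = ("YOUR_API_KEY", "YOUR_API_SECRET")
    params = {"image_url": "https://example.org/2006.271.jpg"}

    response = requests.get(
        "https://api.imagga.com/v2/tags", params=params, auth=auth
    ).json()

    for item in response["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))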

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

human face 98.4
text 97.7
toddler 97.1
wall 97
baby 96.5
clothing 96.4
child 91.4
person 90.3
old 86.7
smile 79.7
boy 75.7
black 72.5
posing 67.9
photograph 58
blackboard 55
picture frame 50.6
box 41.1
vintage 38
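
The Microsoft tags look like output from the Azure Computer Vision Analyze Image API with the Tags feature. A sketch using the REST endpoint directly; the resource endpoint, key, API version, and image URL are placeholders:

    import requests

    # Azure Computer Vision analyze call; endpoint, key, and image URL are placeholders.
    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    key = "YOUR_SUBSCRIPTION_KEY"

    response = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": key},
        json={"url": "https://example.org/2006.271.jpg"},
    ).json()

    for tag in response["tags"]:
        print(tag["name"], round(tag["confidence"] * 100, 1))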

Face analysis

AWS Rekognition

Age 1-7
Gender Female, 99.8%
Calm 62.8%
Happy 33%
Confused 1.9%
Disgusted 1.1%
Angry 0.4%
Surprised 0.4%
Sad 0.3%
Fear 0.1%
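
The age range, gender, and emotion percentages above follow the shape of a Rekognition DetectFaces response with all attributes requested. A boto3 sketch, with the region and image file name as placeholders:

    import boto3

    # Rekognition face analysis; region and file name are placeholders.
    client = boto3.client("rekognition", region_name="us-east-1")

    with open("2006.271.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')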

Microsoft Cognitive Services

Age 3
Gender Female
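
The age and gender values correspond to the face attributes that the Azure Face detect endpoint returned when this data was generated in 2022; Microsoft has since restricted these attributes. A sketch of that REST call, with endpoint, key, and image URL as placeholders:

    import requests

    # Azure Face detect call as used circa 2022; age/gender attributes are now restricted.
    # Endpoint, key, and image URL are placeholders.
    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    key = "YOUR_SUBSCRIPTION_KEY"

    response = requests.post(
        f"{endpoint}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={"Ocp-Apim-Subscription-Key": key},
        json={"url": "https://example.org/2006.271.jpg"},
    ).json()

    for face in response:
        attrs = face["faceAttributes"]
        print(f'Age {attrs["age"]:.0f}, Gender {attrs["gender"].title()}')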

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely
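
The Google Vision rows map onto the likelihood fields of a face annotation from the Cloud Vision API. A sketch with the google-cloud-vision client library (version 2 or later), assuming application-default credentials and a placeholder local file name:

    from google.cloud import vision

    # Cloud Vision face detection; the file name is a placeholder.
    client = vision.ImageAnnotatorClient()

    with open("2006.271.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)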

Feature analysis

Amazon

Person 97.8%
Painting 58.8%

Categories

Imagga

paintings art 94.2%
people portraits 5.7%
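
These two category scores resemble output from Imagga's categorization endpoint. A sketch assuming the personal_photos categorizer, which is an assumption since the record does not name the categorizer; credentials and image URL are placeholders:

    import requests

    # Imagga categorization call; categorizer ID, credentials, and image URL are placeholders.
    auth = ("YOUR_API_KEY", "YOUR_API_SECRET")
    params = {"image_url": "https://example.org/2006.271.jpg"}

    response = requests.get(
        "https://api.imagga.com/v2/categories/personal_photos", params=params, auth=auth
    ).json()

    for category in response["result"]["categories"]:
        print(category["name"]["en"], round(category["confidence"], 1))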