Human Generated Data

Title

Untitled (portrait of two babies sitting in chair)

Date

1930-1940

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10237

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Furniture 99.9
Person 94.6
Human 94.6
Person 91.1
Crib 82.3
Cradle 78.2
Newborn 74.3
Baby 74.3
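
The label/confidence pairs above are the kind of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch follows, assuming boto3, configured AWS credentials, and a hypothetical local copy of the photograph; the MaxLabels and MinConfidence values are illustrative, not taken from this record.

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

# Hypothetical local filename for the photograph; not part of this record.
with open("untitled_two_babies.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,      # illustrative cap on the number of labels
    MinConfidence=70,  # illustrative confidence floor, in percent
)

# Each label carries a name and a confidence in percent, matching the
# "Furniture 99.9", "Person 94.6", ... layout above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```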

Clarifai
created on 2023-10-26

child 100
baby 99.9
son 99.7
people 99.5
family 99.3
portrait 99.1
sibling 97.9
sepia 97.9
two 97.2
love 96.7
offspring 95.9
girl 95.4
monochrome 95.1
three 94.2
group 93.1
cute 92.8
little 92.7
newborn 91.9
affection 91.8
furniture 87.3

Imagga
created on 2022-01-22

cradle 100
baby bed 100
furniture 100
child 73.2
furnishing 68.7
baby 67.5
kid 48.8
family 47.2
little 44.2
childhood 43
infant 39.6
cute 39.5
newborn 37
mother 34.2
love 33.2
happiness 31.4
face 30.6
boy 30.5
happy 30.1
parent 29.1
children 28.3
care 28
daughter 27.9
portrait 27.9
toddler 26.8
adorable 25
father 24.5
people 22.9
joy 22.6
fun 21.7
innocence 21.2
son 20.9
innocent 20.4
smiling 20.3
person 19.9
cheerful 19.5
male 19.3
eyes 19
kids 18.9
sweet 18.2
life 18
youth 17.9
smile 17.8
home 16.8
together 16.7
bed 15.2
human 15
loving 14.3
holding 14.1
play 13.8
lifestyle 13.8
expression 13.7
adult 13
joyful 12.9
looking 12.8
mom 12.6
parenthood 11.7
studio 11.4
attractive 11.2
skin 11
playing 11
man 10.8
motherhood 10.8
healthy 10.7
brother 10.7
sleep 10.7
hand 10.6
indoors 10.6
casual 10.2
dad 9.9
pretty 9.8
lovely 9.8
health 9.7
one 9.7
affectionate 9.7
hair 9.5
sitting 9.5
funny 9.2
new 8.9
color 8.9
sister 8.8
sleeping 8.8
husband 8.6
close 8.6
offspring 8.5
lying 8.5
black 8.4
head 8.4
leisure 8.3
girls 8.2
look 7.9
hug 7.8
two 7.6
playful 7.6
laughing 7.6
togetherness 7.6
clean 7.5
toy 7.4
soft 7.2
body 7.2

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 99.9
toddler 99.3
baby 99
indoor 97.3
human face 96.3
child 95
text 91.9
boy 83
clothing 79.9
infant 76.9
newborn 68.9
high 23

Color Analysis

Face analysis

AWS Rekognition

Age 0-3
Gender Female, 99.9%
Angry 76.2%
Calm 10.2%
Surprised 5.6%
Disgusted 2.6%
Sad 2.1%
Fear 1.8%
Confused 1.1%
Happy 0.4%

AWS Rekognition

Age 0-3
Gender Female, 88.9%
Calm 98.1%
Confused 0.7%
Angry 0.6%
Surprised 0.4%
Happy 0.1%
Disgusted 0.1%
Sad 0%
Fear 0%
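
The two result blocks above, each with an age range, a gender estimate, and emotions ranked by confidence, match the shape of AWS Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch, assuming boto3 and a hypothetical local image file:

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("untitled_two_babies.jpg", "rb") as f:  # hypothetical local filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions
)

# One FaceDetail per detected face; two babies yield two blocks of results.
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```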

Microsoft Cognitive Services

Age 0
Gender Female

Microsoft Cognitive Services

Age 0
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
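
The per-face likelihood ratings above correspond to the enum-valued fields of Google Cloud Vision's face detection. A minimal sketch, assuming the google-cloud-vision client library (2.x), application-default credentials, and a hypothetical local image file:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application-default credentials

with open("untitled_two_babies.jpg", "rb") as f:  # hypothetical local filename
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Likelihoods are enum members (e.g. VERY_UNLIKELY), not numeric scores,
# which is why the ratings above read "Very unlikely" rather than percentages.
for face in response.face_annotations:
    for label, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(label, vision.Likelihood(value).name.replace("_", " ").capitalize())
```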

Feature analysis

Amazon

Person 94.6%
Crib 82.3%

Categories

Imagga

people portraits 99.8%

Captions

Microsoft
created on 2022-01-22

a person holding a baby 89.9%
a person holding a baby 89.6%
a person holding a baby 80.5%
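
Ranked caption candidates with confidence scores like the three above are what Azure Computer Vision's describe-image operation produces. A hedged sketch, assuming the azure-cognitiveservices-vision-computervision SDK with placeholder endpoint, key, and filename; the service reports confidences in the 0-1 range, scaled here to percentages to match the list above:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<subscription-key>"),      # placeholder key
)

with open("untitled_two_babies.jpg", "rb") as image_stream:  # hypothetical filename
    description = client.describe_image_in_stream(image_stream, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```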

Text analysis

Amazon

Schweig
Studio
Studio Pro
Pro
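
Word- and line-level detections like these ("Studio Pro" alongside its individual words) are the typical output of AWS Rekognition's DetectText operation. A minimal sketch, assuming boto3 and a hypothetical local image file:

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("untitled_two_babies.jpg", "rb") as f:  # hypothetical local filename
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Detections come back at both LINE and WORD granularity, which is why
# "Studio Pro" and the separate words "Studio" and "Pro" can all appear.
for detection in response["TextDetections"]:
    print(detection["DetectedText"], detection["Type"], round(detection["Confidence"], 1))
```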

Google

Schweig S
Schweig
S