Human Generated Data

Title

Untitled (two babies sitting on bed against wall)

Date

1930-1940

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10194

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Baby 98
Human 98
Newborn 98
Face 96.1
Person 93.4
Person 93
Photo 75.3
Photography 75.3
Portrait 75.3
People 69.4
Smile 60.3

Imagga
created on 2022-01-22

child 100
sibling 73.1
baby 55.5
family 49
kid 46.2
mother 42.7
little 41.6
cute 40.3
childhood 39.5
boy 39.2
parent 39
children 37.4
son 36
happy 34.5
love 34
happiness 33
daughter 32.6
toddler 31.6
father 29
infant 29
adorable 27.8
kids 27.4
portrait 27.2
fun 27
cheerful 25.3
male 24.8
smiling 24.7
care 23.1
face 22.8
joy 21.8
sitting 21.5
people 21.2
together 19.3
play 19
smile 18.6
innocent 17.9
girls 17.4
innocence 17.3
brother 16.9
human 16.5
offspring 16.3
person 16
dad 15.9
newborn 15.6
playing 15.5
home 15.2
togetherness 15.1
adult 14.9
holding 14.9
toy 14.8
two 14.4
playful 14.2
lifestyle 13.8
blond 13.5
laughing 13.3
eyes 12.9
joyful 12.9
looking 12.8
mom 12.6
loving 12.4
expression 12
sweet 11.9
parenthood 11.7
boys 11.7
affectionate 11.6
affection 11.6
indoors 11.4
healthy 11.4
youth 11.1
casual 11
indoor 11
man 10.8
hug 10.7
clothing 10.5
one 10.5
enjoyment 10.3
life 10.2
hold 10.2
leisure 10
generations 9.9
kin 9.8
sister 9.8
husband 9.6
studio 9.1
grandfather 9
color 8.9
preschooler 8.8
females 8.5
friends 8.5
black 8.4
funny 8.3
women 7.9
mommy 7.9
couple 7.9
hands 7.8
embracing 7.8
attractive 7.7
diversity 7.7
generation 7.7
old 7.7
lifestyles 7.6
enjoying 7.6
enjoy 7.5
senior 7.5
outdoors 7.5
grandma 7.3
game 7.1
lovely 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

toddler 99
person 98.9
child 97.4
human face 94
text 93.7
baby 92.7
indoor 88.8
boy 81.4
clothing 80.4
smile 54.1
seat 35.2

Face analysis

AWS Rekognition

Age 0-3
Gender Female, 100%
Happy 99.9%
Calm 0%
Surprised 0%
Sad 0%
Angry 0%
Disgusted 0%
Fear 0%
Confused 0%

AWS Rekognition

Age 0-3
Gender Female, 100%
Surprised 99.7%
Calm 0.1%
Happy 0.1%
Fear 0.1%
Confused 0%
Disgusted 0%
Angry 0%
Sad 0%

Microsoft Cognitive Services

Age 0
Gender Female

Microsoft Cognitive Services

Age 0
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Possible
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 93.4%

Captions

Microsoft

a vintage photo of a baby 95.7%
a vintage photo of a person holding a baby 89.8%
a vintage photo of a baby holding a stuffed animal 77.1%

Text analysis

Amazon

MARTIN
AND
TEN
OF
BE
WITHIN
THIS
THE
WITHIN TEN DAYS
OF THE
RETURNED
DAYS
AND SHOULD BE RETURNED
PROPERTY
SHOULD
STUDIO
THIS PROOPOS THE PROPERTY
PROOPOS
MARTIN SCHWEIZ STUDIO
SCHWEIZ

Google

THIS PROOF IS THE PROPER OF TE MARTIS SCHWE UD AND SHOULD BE RETURNED LAITHIN TEN DAYS
IS
TE
AND
DAYS
THE
SCHWE
PROOF
OF
MARTIS
UD
TEN
THIS
PROPER
SHOULD
BE
RETURNED
LAITHIN