Human Generated Data

Title

Untitled (woman leaning over baby in bassinet)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900-1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12810

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.2
Baby 99.2
Newborn 99.2
Person 98.5
Furniture 94
Bed 93.7
Person 87.8
Female 81
Woman 68.1
Face 61.7
Girl 58

Clarifai
created on 2019-11-16

people 99.8
two 98.6
portrait 98.5
baby 98.4
woman 98.3
adult 98.1
child 97.6
monochrome 96.9
girl 96.5
family 95.3
love 94.6
street 92.4
man 92.2
group 90.8
son 89.7
one 89.6
nude 89.1
furniture 88.6
offspring 88.6
kiss 88.4

Imagga
created on 2019-11-16

couple 30.5
people 28.4
happy 26.9
love 26.8
adult 25.3
man 24.2
happiness 23.5
male 22.2
home 21.5
person 19.9
child 19.7
two 18.6
mother 17.6
bed 17.4
bride 17.3
portrait 16.8
attractive 16.1
family 16
relationship 15.9
smiling 15.9
together 14.9
salon 14.7
wedding 14.7
groom 14.6
husband 14.5
lifestyle 14.5
loving 14.3
looking 13.6
parent 13.6
dress 13.6
smile 13.5
indoors 13.2
hospital 12.9
sexy 12.8
face 12.8
casual 12.7
daughter 12.5
romance 12.5
bedroom 12.5
kid 12.4
pretty 11.9
women 11.9
romantic 11.6
wife 11.4
model 10.9
room 10.7
affectionate 10.6
boyfriend 10.6
girlfriend 10.6
cheerful 10.6
hairdresser 10.5
men 10.3
elegance 10.1
relaxation 10
indoor 10
girls 10
clothing 9.9
fashion 9.8
lady 9.7
interior 9.7
brunette 9.6
hair 9.5
marriage 9.5
sitting 9.4
cute 9.3
leisure 9.1
brother 8.7
affection 8.7
patient 8.7
device 8.6
married 8.6
comfortable 8.6
females 8.5
youth 8.5
one 8.2
children 8.2
style 8.2
childhood 8.1
celebration 8
boy 7.8
embracing 7.8
old 7.7
adults 7.6
togetherness 7.5
house 7.5
black 7.5
joy 7.5
human 7.5
vintage 7.4
holding 7.4
20s 7.3
kin 7.3
handsome 7.1
day 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

baby 97.9
human face 97.5
toddler 95.6
clothing 94.4
person 91
smile 88.4
indoor 87.8
black and white 85.9
text 80
woman 75.2

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 22-34
Gender Female, 98.9%
Fear 0.3%
Confused 0.4%
Happy 92.5%
Angry 0.2%
Disgusted 0.2%
Calm 2.6%
Surprised 0.2%
Sad 3.6%

AWS Rekognition

Age 0-4
Gender Female, 94.1%
Happy 1.5%
Angry 0.7%
Disgusted 0.3%
Confused 5.4%
Sad 0.8%
Calm 73.8%
Surprised 17.4%
Fear 0.1%

Microsoft Cognitive Services

Age 0
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%
Bed 93.7%

Categories

Imagga

people portraits 84.8%
paintings art 12.6%
pets animals 1.8%

Text analysis

Google

7GETVE