Human Generated Data

Title

Untitled (man and woman in formal attire seen close-up, she with flowers in hair, he with handkerchief in pocket)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12802

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Clothing 99.9
Apparel 99.9
Person 98.2
Human 98.2
Person 97.4
Robe 96.3
Evening Dress 96.3
Gown 96.3
Fashion 96.3
Face 92.2
Suit 80.8
Overcoat 80.8
Coat 80.8
Wedding 77.6
Female 72.4
Wedding Gown 70.3
Photography 69.5
Portrait 69.5
Photo 69.5
Sleeve 65.3
Indoors 64.7
Interior Design 64.7
Tuxedo 62
Woman 57.5
Bride 57
Finger 55.7
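
The Amazon tags above are label-detection output of the kind AWS Rekognition's DetectLabels operation returns, with confidence scores on a 0-100 scale. A minimal sketch of how such labels could be retrieved with boto3 follows; the image path is a placeholder, not part of the collection record.

```python
# Minimal sketch: retrieving image labels with AWS Rekognition (boto3).
# The image path is a placeholder; confidences come back on a 0-100 scale,
# matching the "Clothing 99.9", "Apparel 99.9", ... values listed above.
import boto3


def detect_labels(image_path, min_confidence=55.0, max_labels=30):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=max_labels,
            MinConfidence=min_confidence,
        )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]


if __name__ == "__main__":
    for name, confidence in detect_labels("photo.jpg"):
        print(f"{name} {confidence:.1f}")
```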

Clarifai
created on 2019-11-16

people 99.9
portrait 99.8
monochrome 99
adult 98.3
man 98.2
two 98.1
woman 96.6
affection 96.3
actress 96.1
couple 95.3
facial expression 94.1
wedding 93.8
love 93.7
music 93.1
wear 92.3
girl 91.5
group 89.6
retro 85.3
son 84.1
musician 83.5

Imagga
created on 2019-11-16

couple 46.2
love 43.5
male 37.2
man 37
portrait 35.6
happy 33.3
attractive 32.2
people 31.3
adult 29.1
two 28.8
romance 27.7
relationship 26.3
parent 26.1
happiness 25.9
together 25.4
dad 24.1
groom 23.6
smile 23.5
smiling 23.2
person 22.4
boyfriend 22.2
father 21.8
lifestyle 21.7
adolescent 21.5
daughter 21.1
girlfriend 20.2
romantic 19.6
pretty 18.9
sexy 18.5
face 18.5
cute 18
mother 17.9
child 17.4
husband 17
family 16.9
married 16.3
black 15.9
juvenile 15.9
women 15.8
casual 15.3
boy 14.8
son 14.7
sibling 14
men 13.8
fashion 13.6
handsome 13.4
marriage 13.3
expression 12.8
fun 12.7
lovers 12.6
joy 12.5
model 12.5
loving 12.4
wife 12.3
lady 12.2
looking 12
hair 11.9
sensual 11.8
dating 11.7
hug 11.6
erotic 11.5
pair 11.3
passion 11.3
youth 11.1
bow tie 10.8
darling 10.7
affectionate 10.7
studio 10.7
brunette 10.5
wedding 10.1
human 9.8
tenderness 9.8
affection 9.7
adults 9.5
togetherness 9.5
friends 9.4
skin 9.3
teenager 9.1
sensuality 9.1
holding 9.1
cheerful 9
world 8.9
kid 8.9
passionate 8.8
bride 8.8
body 8.8
necktie 8.8
hugging 8.8
engagement 8.7
eyes 8.6
ethnic 8.6
close 8.6
elegance 8.4
lips 8.3
emotion 8.3
teen 8.3
care 8.2
outdoors 8.2
girls 8.2
guy 8.2
valentine 8.2
home 8
look 7.9
heterosexual 7.9
faces 7.8
kiss 7.8
brother 7.8
sex 7.8
sexual 7.7
diversity 7.7
suit 7.4
business 7.3
group 7.3
dress 7.2
lovely 7.1
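
The Imagga tags above match the shape of Imagga's /v2/tags REST endpoint, which reports English tag names with 0-100 confidence scores. A minimal sketch using the requests library, with placeholder credentials and image URL:

```python
# Minimal sketch: tagging an image with the Imagga /v2/tags endpoint.
# The API key/secret and image URL are placeholders; the response layout
# ({"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}) follows
# the documented v2 API.
import requests


def imagga_tags(image_url, api_key, api_secret):
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(api_key, api_secret),
    )
    response.raise_for_status()
    return [
        (item["tag"]["en"], item["confidence"])
        for item in response.json()["result"]["tags"]
    ]


if __name__ == "__main__":
    for tag, confidence in imagga_tags(
        "https://example.com/photo.jpg", "API_KEY", "API_SECRET"
    ):
        print(f"{tag} {confidence:.1f}")
```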

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

wall 99.9
person 99.4
smile 98.3
human face 97.8
indoor 96.4
woman 95.7
text 92.5
posing 88.6
bride 79.3
wedding dress 76.7
clothing 76.2
black and white 72.9
portrait 59.3
face 58.5
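
The Microsoft tags above are consistent with output from the Azure Computer Vision tagging operation. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders, and the SDK's 0-1 confidences are scaled to match the values listed here.

```python
# Minimal sketch: image tagging with the Azure Computer Vision SDK.
# Endpoint, key, and image URL are placeholders; tag confidences are
# reported on a 0-1 scale and multiplied by 100 here for comparison
# with the values above.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials


def microsoft_tags(image_url, endpoint, key):
    client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))
    result = client.tag_image(image_url)
    return [(tag.name, tag.confidence * 100) for tag in result.tags]


if __name__ == "__main__":
    for name, confidence in microsoft_tags(
        "https://example.com/photo.jpg",
        "https://<resource>.cognitiveservices.azure.com/",
        "KEY",
    ):
        print(f"{name} {confidence:.1f}")
```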

Color Analysis

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 98.8%
Surprised 0.5%
Angry 0.4%
Happy 95.2%
Fear 0.1%
Calm 2.2%
Disgusted 1.2%
Sad 0.1%
Confused 0.3%

AWS Rekognition

Age 32-48
Gender Male, 95%
Happy 89.2%
Fear 0.6%
Sad 0.3%
Confused 1.9%
Disgusted 0.5%
Surprised 1.5%
Calm 4.2%
Angry 1.6%
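
The two AWS Rekognition entries above (age range, gender, and per-emotion confidences) correspond to the fields returned by the DetectFaces operation when all attributes are requested. A minimal boto3 sketch, with a placeholder image path:

```python
# Minimal sketch: face analysis with AWS Rekognition DetectFaces (boto3).
# Attributes=["ALL"] requests age range, gender, and emotion estimates,
# the fields shown above. The image path is a placeholder.
import boto3


def detect_faces(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()}, Attributes=["ALL"]
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")


if __name__ == "__main__":
    detect_faces("photo.jpg")
```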

Microsoft Cognitive Services

Age 45
Gender Male

Microsoft Cognitive Services

Age 39
Gender Female
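
The Microsoft age and gender estimates above are the kind of attributes the Azure Face API's detect operation could return at the time this data was generated. A hedged sketch using the (since deprecated) azure-cognitiveservices-vision-face SDK; endpoint, key, and image URL are placeholders.

```python
# Minimal sketch: age/gender estimation with the Azure Face API
# (azure-cognitiveservices-vision-face, since deprecated).
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials


def face_attributes(image_url, endpoint, key):
    client = FaceClient(endpoint, CognitiveServicesCredentials(key))
    faces = client.face.detect_with_url(
        url=image_url,
        return_face_attributes=["age", "gender"],
    )
    for face in faces:
        print(f"Age {face.face_attributes.age:.0f}")
        print(f"Gender {face.face_attributes.gender}")


if __name__ == "__main__":
    face_attributes(
        "https://example.com/photo.jpg",
        "https://<resource>.cognitiveservices.azure.com/",
        "KEY",
    )
```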

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely
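
The Google Vision results above report likelihood buckets (Very unlikely through Very likely) rather than percentages, which is how the Cloud Vision API's face detection feature expresses emotion and attribute estimates. A minimal sketch using the google-cloud-vision client, with a placeholder image URI:

```python
# Minimal sketch: face detection likelihoods with the Google Cloud Vision API
# (google-cloud-vision). The image URI is a placeholder; each detected face
# carries the joy/sorrow/anger/surprise/headwear/blurred buckets shown above.
from google.cloud import vision


def face_likelihoods(image_uri):
    client = vision.ImageAnnotatorClient()
    image = vision.Image(source=vision.ImageSource(image_uri=image_uri))
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)


if __name__ == "__main__":
    face_likelihoods("https://example.com/photo.jpg")
```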

Feature analysis

Amazon

Person 98.2%

Categories