Human Generated Data

Title

Untitled (portrait of older woman seated in chair with lacy scarf and hands on lap)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900-1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12784

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99
Human 99
Apparel 97.3
Clothing 97.3
Face 90.8
Sitting 89.1
Female 88.5
Woman 77.3
Child 76.9
Teen 76.9
Girl 76.9
Kid 76.9
Blonde 76.9
Sleeve 75
Coat 68.3
Overcoat 68.3
Photography 66.8
Photo 66.8
Portrait 66.8
Suit 66.5
Finger 65.8
Door 64.6
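
The Amazon tags above are label names with confidence scores (0-100). As a minimal sketch, assuming these came from the AWS Rekognition DetectLabels API and that a local copy of the digitized print exists at a hypothetical path, output in this form could be reproduced with boto3:

    import boto3

    # Hypothetical local filename for the digitized print; not part of the museum record.
    IMAGE_PATH = "4_2002_12784.jpg"

    # Assumes AWS credentials are already configured in the environment.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=25,       # cap the number of labels returned
            MinConfidence=60,   # drop labels below 60% confidence
        )

    # Print "Label confidence", mirroring the list layout above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")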

Clarifai
created on 2019-11-16

people 99.8
portrait 99.2
adult 98.1
one 97.2
woman 97
music 95.7
profile 94
man 92.6
musician 90.1
window 88.8
sit 88.3
wear 87.8
indoors 86.8
actress 85.8
singer 85.3
administration 84.8
light 82.3
actor 81.9
monochrome 81.3
writer 79.6

Imagga
created on 2019-11-16

black 39.1
sexy 34.6
person 34.3
adult 31.3
portrait 29.8
model 29.6
attractive 27.3
dark 24.2
pretty 23.1
lady 22.7
body 22.4
people 22.3
man 22.2
fashion 20.4
face 19.9
male 19.9
hair 19.8
human 18.8
call 18.1
one 17.9
posing 17.8
sensual 16.4
lifestyle 15.9
sensuality 14.5
looking 14.4
microphone 14.3
passion 14.1
brunette 13.9
elegant 13.7
skin 13.7
dress 13.6
couple 13.1
lingerie 13
expression 12.8
make 12.7
love 12.6
style 12.6
erotic 12.5
romantic 12.5
studio 12.2
suit 11.6
look 11.4
girlfriend 10.6
eyes 10.3
healthy 10.1
elegance 10.1
alone 10.1
gorgeous 10
pose 10
hand 9.9
handsome 9.8
hugging 9.8
sexual 9.6
youth 9.4
relationship 9.4
two 9.3
device 9.3
room 9.3
night 8.9
bra 8.8
happy 8.8
boyfriend 8.7
cute 8.6
serious 8.6
business 8.5
slim 8.3
holding 8.3
blond 8.1
romance 8
lovely 8
performer 7.9
indoors 7.9
closeness 7.9
happiness 7.8
standing 7.8
flirt 7.8
underwear 7.7
modern 7.7
muscular 7.6
relaxation 7.5
lips 7.4
training 7.4
emotion 7.4
light 7.4
grand piano 7.4
back 7.4
star 7.3
confident 7.3
fitness 7.2
professional 7.1
smile 7.1
interior 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

human face 98.9
wall 95.3
indoor 93.9
clothing 92.4
person 90.2
text 88.7
woman 81.6
portrait 59.6
black and white 52.9
display 25.3

Color Analysis

Face analysis

AWS Rekognition

Age 43-61
Gender Female, 97.6%
Angry 26.5%
Happy 7.2%
Disgusted 3.5%
Calm 45.2%
Fear 3%
Surprised 1.1%
Sad 6.4%
Confused 7.1%
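
The age range, gender, and emotion percentages above are the fields AWS Rekognition reports for a detected face. A minimal sketch of how such values could be requested with boto3, assuming a hypothetical local filename and configured AWS credentials:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("4_2002_12784.jpg", "rb") as f:  # hypothetical filename
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, and emotion confidences
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")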

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
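
The "Very unlikely" ratings above correspond to the likelihood enum that Google Cloud Vision face detection returns for each attribute. A minimal sketch using the google-cloud-vision client library, assuming a hypothetical local filename and configured Google Cloud credentials:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("4_2002_12784.jpg", "rb") as f:  # hypothetical filename
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Map the Likelihood enum values (0-5) to the labels used above.
    likelihood_names = ("Unknown", "Very unlikely", "Unlikely",
                       "Possible", "Likely", "Very likely")

    for face in response.face_annotations:
        print("Surprise", likelihood_names[face.surprise_likelihood])
        print("Anger", likelihood_names[face.anger_likelihood])
        print("Sorrow", likelihood_names[face.sorrow_likelihood])
        print("Joy", likelihood_names[face.joy_likelihood])
        print("Headwear", likelihood_names[face.headwear_likelihood])
        print("Blurred", likelihood_names[face.blurred_likelihood])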

Feature analysis

Amazon

Person 99%

Categories