Human Generated Data

Title

Untitled (portrait of woman looking to left with hands clasped by neck)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12793

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 91.1
Person 91.1
Finger 89.8
Hair 59.8
Screen 57.2
Display 57.2
Monitor 57.2
Electronics 57.2
Neck 56.5
Head 55.8

Clarifai
created on 2019-11-16

people 99.6
portrait 99.5
monochrome 99.4
woman 97.7
one 97.7
adult 96.4
music 95.4
girl 94.8
profile 94.7
musician 89.7
model 89.6
singer 88.1
fashion 84.3
side view 83.4
street 82.8
analogue 82.1
man 81.7
dark 80.8
actress 80.4
black and white 80.3

Imagga
created on 2019-11-16

portrait 40.8
black 39.4
face 37.7
person 34.1
adult 32.4
lips 30.6
attractive 30.1
model 29.6
sexy 27.3
hair 26.2
fashion 23.4
pretty 23.1
people 21.8
eyes 21.5
head 20.2
dark 19.2
sensual 19.1
studio 19
skin 18.8
expression 18.8
lady 18.7
human 18
makeup 17.1
close 16.6
sensuality 16.4
make 16.4
style 16.3
world 16.2
women 15.8
cute 15.8
one 15.7
man 15
male 14.3
brunette 14
posing 13.3
smile 12.8
looking 12.8
background 12.5
art 11.8
eye 11.6
serious 11.5
look 11.4
elegance 10.9
lifestyle 10.9
body 10.4
desire 9.6
facial 9.6
hands 9.6
screen 9.5
closeup 9.4
happy 9.4
youth 9.4
cosmetics 9.4
long 9.2
device 9.2
hairstyle 8.6
elegant 8.6
adolescent 8.5
casual 8.5
juvenile 8.4
hand 8.4
girls 8.2
blond 8.2
display 8
smiling 8
love 7.9
sculpture 7.7
smasher 7.6
bust 7.4
emotion 7.4
cheerful 7.3

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

person 98.6
text 98.3
human face 97.1
black and white 91.9
face 87.8
portrait 80.6
monochrome 74.2
black 65.1
staring 24.3
picture frame 6.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 25-39
Gender Female, 89.5%
Confused 0.3%
Fear 0.6%
Sad 4.8%
Disgusted 0.5%
Happy 0.7%
Angry 0.8%
Calm 91.8%
Surprised 0.5%

Feature analysis

Amazon

Person 91.1%

Categories

Imagga

paintings art 97.9%
pets animals 1.1%

Captions