Human Generated Data

Title

Untitled (portrait of woman)

Date

c. 1935-1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4396

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.4
Human 99.4
Face 96.6
Apparel 72.1
Clothing 72.1
Photo 64.9
Photography 64.9
Portrait 64.9
Sleeve 62.8
Head 61.2
Home Decor 58.8

Clarifai
created on 2019-06-01

people 99.1
portrait 98.8
one 97.4
adult 96.5
man 96.1
old 91.6
person 89.9
elderly 86.6
woman 86.3
art 84.6
mustache 84.5
desktop 82.8
wear 82.5
monochrome 82.4
vertical 78.9
retro 77.9
face 77.3
music 74.5
facial hair 74.2
hair 72.9

Imagga
created on 2019-06-01

portrait 33.7
person 32.9
face 29.9
man 28.9
adult 24.6
senior 24.4
male 24.1
black 23.6
people 23.4
grandma 22.5
looking 22.4
oxygen mask 22.1
old 20.9
expression 20.5
head 19.3
crazy 19.2
hair 19
mask 18.9
one 17.9
device 17.9
breathing device 17.8
elderly 16.3
human 15.8
eyes 14.6
covering 14.2
happy 13.2
mature 13
lady 13
studio 12.9
cap 12.8
attractive 12.6
shower cap 11.5
look 11.4
clothing 11.3
alone 11
model 10.9
eye 10.7
pretty 10.5
serious 10.5
skin 10.4
happiness 10.2
glasses 10.2
smile 10
aged 10
beard 10
men 9.5
emotion 9.2
close 9.1
hand 9.1
fashion 9.1
wrinkle 8.9
depression 8.8
hands 8.7
dark 8.4
make 8.2
closeup 8.1
lifestyle 8
businessman 8
older 7.8
retired 7.8
sad 7.7
spiritual 7.7
casual 7.6
age 7.6
negative 7.5
fun 7.5
mustache 7.4
attire 7.3
bust 7.3
smiling 7.2
body 7.2
headdress 7.2
handsome 7.1
posing 7.1
love 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

human face 98.2
indoor 96.9
wall 96.7
person 94.8
man 92.1
portrait 85.3
clothing 78.3
black and white 78.2
necklace 52.8
staring 29.7
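
Each machine-generated tag above pairs a label with a confidence score on a 0-100 scale. A minimal sketch of filtering such a list by a confidence threshold, using the Microsoft tags above as sample data (the threshold of 90 is an arbitrary choice for illustration):

```python
# Microsoft tags from above: label -> confidence score (0-100)
microsoft_tags = {
    "human face": 98.2,
    "indoor": 96.9,
    "wall": 96.7,
    "person": 94.8,
    "man": 92.1,
    "portrait": 85.3,
    "clothing": 78.3,
    "black and white": 78.2,
    "necklace": 52.8,
    "staring": 29.7,
}

def confident_tags(tags, threshold=90.0):
    """Return (label, score) pairs at or above the threshold, highest first."""
    kept = [(label, score) for label, score in tags.items() if score >= threshold]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

print(confident_tags(microsoft_tags))
# → [('human face', 98.2), ('indoor', 96.9), ('wall', 96.7), ('person', 94.8), ('man', 92.1)]
```

The same filter applies unchanged to the Amazon, Clarifai, and Imagga lists, since all four services report tags in the same label-plus-score shape.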

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 45-63
Gender Male, 96.1%
Happy 4.2%
Disgusted 4.8%
Angry 7.3%
Surprised 8.2%
Sad 30.4%
Calm 38.9%
Confused 6.2%
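
The emotion scores above behave like a probability distribution (they sum to 100%). A minimal sketch of reducing them to a single dominant emotion, using the AWS Rekognition values listed above:

```python
# AWS Rekognition emotion scores from above: emotion -> confidence (%)
emotions = {
    "Happy": 4.2,
    "Disgusted": 4.8,
    "Angry": 7.3,
    "Surprised": 8.2,
    "Sad": 30.4,
    "Calm": 38.9,
    "Confused": 6.2,
}

# The dominant emotion is simply the one with the highest score.
dominant = max(emotions, key=emotions.get)
print(dominant, emotions[dominant])
# → Calm 38.9
```

For this portrait the scores are fairly spread out (Calm 38.9% vs Sad 30.4%), so the single top label should be read as a weak signal rather than a firm classification.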

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

paintings art 99.8%

Text analysis

Amazon

MJ13