Human Generated Data

Title

Sarah Vaughan

Date

1987-1988

People

Artist: Brian Lanker, American (1947-2011)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.293

Copyright

© Brian Lanker

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Musical Instrument 96.2
Musician 96.2
Human 96.2
Person 94.8
Leisure Activities 93
Piano 86.5
Pianist 86.5
Performer 86.5
Finger 72.7
Painting 72.5
Art 72.5
Lamp 64.3
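
The Amazon tags above are label/confidence pairs of the kind returned by the AWS Rekognition DetectLabels operation. A minimal sketch with boto3, assuming configured AWS credentials and a placeholder S3 bucket and key (not the museum's actual asset):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Placeholder bucket/key; any image source Rekognition accepts would work.
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "my-bucket", "Name": "sarah_vaughan.jpg"}},
        MaxLabels=20,
        MinConfidence=60.0,
    )

    for label in response["Labels"]:
        # Prints pairs like "Musical Instrument 96.2" or "Piano 86.5".
        print(f"{label['Name']} {label['Confidence']:.1f}")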

Clarifai
created on 2018-02-10

people 99.5
portrait 98.4
one 98.2
monochrome 97
adult 96.3
woman 95.1
music 93.2
art 93
man 92.5
profile 88.7
shadow 86.5
musician 85.9
indoors 82.5
girl 81.1
light 77.8
dark 77.6
silhouette 76.5
facial expression 75.5
wear 74.7
singer 74.3
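
The Clarifai tags follow the same tag/confidence pattern (Clarifai reports values between 0 and 1). A hedged sketch of a request to Clarifai's v2 REST API with its general recognition model; the endpoint path, model name, and key handling are assumptions, and the image URL is a placeholder:

    import requests

    CLARIFAI_API_KEY = "YOUR_API_KEY"  # placeholder credential
    url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
    payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}
    headers = {"Authorization": f"Key {CLARIFAI_API_KEY}"}

    resp = requests.post(url, json=payload, headers=headers, timeout=30)
    resp.raise_for_status()

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Prints pairs like "people 99.5" (value scaled to a percentage).
        print(f"{concept['name']} {concept['value'] * 100:.1f}")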

Imagga
created on 2018-02-10

black 38.4
person 34.9
sexy 26.5
adult 26
dark 25.1
model 24.9
man 23.5
portrait 22
male 21.4
people 21.2
attractive 21
fashion 19.6
studio 18.2
hair 18.2
lady 17.9
body 17.6
human 17.2
pretty 16.8
looking 15.2
broadcaster 15.2
face 14.9
harp 14.6
one 14.2
expression 13.7
skin 12.5
communicator 12.3
brunette 12.2
sensual 11.8
sensuality 11.8
passion 11.3
love 11
elegance 10.9
nude 10.7
piano 10.6
hand 10.6
serious 10.5
erotic 10.4
light 10.1
suit 10.1
cute 10
business 9.7
eyes 9.5
laptop 9.2
makeup 9.2
style 8.9
night 8.9
world 8.9
smiling 8.7
lifestyle 8.7
sexual 8.7
smile 8.6
modern 8.4
holding 8.3
make 8.2
handsome 8
guy 8
lovely 8
posing 8
device 8
working 8
businessman 7.9
sitting 7.7
computer 7.7
youth 7.7
seductive 7.6
lighting 7.6
happy 7.5
office 7.4
lips 7.4
back 7.3
blond 7.3
alone 7.3
gorgeous 7.3
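
The Imagga tags can be reproduced in outline with its v2 tagging endpoint; the endpoint path, parameter name, and basic-auth credentials below are assumptions based on Imagga's public REST API, and the image URL is a placeholder:

    import requests

    IMAGGA_KEY, IMAGGA_SECRET = "YOUR_KEY", "YOUR_SECRET"  # placeholder credentials

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    resp.raise_for_status()

    for tag in resp.json()["result"]["tags"]:
        # Prints pairs like "black 38.4" or "person 34.9".
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")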

Google
created on 2018-02-10

Color Analysis

Face analysis

AWS Rekognition

Age 35-55
Gender Female, 99.8%
Sad 8%
Calm 48.9%
Happy 8.3%
Angry 3.7%
Confused 12.1%
Surprised 2.6%
Disgusted 16.4%
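
The AWS Rekognition attributes above (age range, gender, per-emotion confidences) come from its face-detection operation. A minimal sketch with boto3, again with placeholder credentials and image location:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "my-bucket", "Name": "sarah_vaughan.jpg"}},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            # Prints lines like "Calm 48.9%" or "Disgusted 16.4%".
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")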

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
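
Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A sketch with the google-cloud-vision client, assuming application-default credentials and a placeholder local file; the enum-wrapping detail varies by library version:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("sarah_vaughan.jpg", "rb") as f:  # placeholder path
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Likelihood values map to names such as VERY_UNLIKELY.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)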

Feature analysis

Amazon

Person 94.8%
Painting 72.5%

Captions

Microsoft
created on 2018-02-10

a person in a dark room 76.7%
a person sitting in a dark room 57.9%
a person in a dark room 57.8%
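
The Microsoft captions are image descriptions with confidence scores, as produced by the Azure Computer Vision "describe" feature. A sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("YOUR_KEY"),  # placeholder key
    )

    analysis = client.describe_image("https://example.com/photo.jpg", max_candidates=3)

    for caption in analysis.captions:
        # Prints lines like "a person in a dark room 76.7%".
        print(f"{caption.text} {caption.confidence * 100:.1f}%")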