Human Generated Data

Title

Marva Nettles Collins

Date

1987-1988

People

Artist: Brian Lanker, American, 1947-2011

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.303

Copyright

© Brian Lanker

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Person 99.1
Human 99.1
Face 86.8
Finger 64.3
Photo 60.5
Photography 60.5
Portrait 60.5
Blackboard 59.8
Room 58.6
Indoors 58.6
Hair 57

Clarifai
created on 2018-03-20

people 99.8
one 99.4
portrait 99
adult 98.7
man 97.6
music 95.3
indoors 94.8
wear 91.1
musician 90
side view 85.3
profile 82.6
woman 81.2
facial expression 80.1
writer 78.9
pianist 77.8
administration 77
sit 76.1
leader 72.1
singer 71.5
elderly 70.5

Imagga
created on 2018-03-20

man 47.7
male 45.5
person 42
black 35.9
portrait 34.3
adult 28.6
people 28.4
handsome 25
dark 22.5
face 22
serious 20
expression 17.1
one 16.4
looking 16
cipher 15.8
eyes 15.5
suit 15.4
attractive 15.4
human 15
guy 14.6
studio 14.4
professional 14.1
disk jockey 13.2
businessman 12.4
lifestyle 12.3
hand 12.1
business 12.1
men 12
confident 11.8
work 11.8
model 11.7
smiling 11.6
casual 11
alone 11
masculine 10.7
happy 10.7
broadcaster 10.6
look 10.5
standing 10.4
style 10.4
communicator 10.3
clothing 10.3
world 10.3
success 9.7
smile 9.3
head 9.2
to 8.8
sexy 8.8
hair 8.7
boy 8.7
hope 8.7
strong 8.4
power 8.4
fashion 8.3
executive 8.3
fitness 8.1
cool 8
close 8
couple 7.8
jacket 7.7
tie 7.6
closeup 7.4
posing 7.1

Google
created on 2018-03-20

Microsoft
created on 2018-03-20

person 97.5

Face analysis

AWS Rekognition

Age 35-52
Gender Female, 92.1%
Happy 0.5%
Angry 5.5%
Surprised 2%
Sad 3%
Disgusted 0.7%
Calm 85.6%
Confused 2.6%

Microsoft Cognitive Services

Age 58
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a man that is standing in the dark 76%
a man standing in a dark room 72.4%
a man in a dark room 72.3%

Text analysis

Amazon

1602
1602 o
o