Human Generated Data

Title

Deaf Mute, New York, N.Y.

Date

1950, printed 1991

People

Artist: Louis Faurer, American 1916 - 2001

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Schneider/Erdman Printer's Proof Collection, partial gift, and partial purchase through the Margaret Fisher Fund, 2011.183

Copyright

© Estate of Louis Faurer

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Person 97.9
Human 97.9
Person 97.2
Finger 80.2
Face 67.4
Photography 61.5
Photo 61.5
Text 58.9
Performer 55.9
Advertisement 55.7

Clarifai
created on 2018-03-23

people 99.9
portrait 99.1
adult 98.5
one 98
monochrome 96.5
woman 93.7
man 92.5
music 91.4
profile 87.9
actor 86.7
actress 84.8
art 83.6
wear 83.3
musician 83.1
retro 83
leader 80.7
facial expression 78.9
administration 76.6
two 73.8
theater 71.7

Imagga
created on 2018-03-23

portrait 37.5
black 34.1
face 33.4
man 27.6
world 26.2
adult 25.9
person 25.6
hair 25.4
male 24.8
model 24.1
attractive 21.7
eyes 21.5
human 20.3
people 20.1
sexy 20.1
expression 19.6
close 18.3
fashion 18.1
skin 17.9
body 16
dark 15.9
head 15.1
sensual 14.6
covering 14
look 14
pretty 14
make 13.6
hairstyle 13.3
book jacket 13.3
makeup 13.2
lips 13
studio 12.9
one 12.7
jacket 12.3
cute 11.5
art 11.4
lady 11.4
looking 11.2
negative 11.2
military uniform 11.1
youth 11.1
emotion 11.1
uniform 10.4
style 10.4
film 10.3
smile 10
handsome 9.8
guy 9.6
brunette 9.6
nose 9.5
serious 9.5
women 9.5
sensuality 9.1
posing 8.9
clothing 8.8
hands 8.7
cosmetics 8.4
elegance 8.4
eye 8
love 7.9
wrapping 7.9
seductive 7.7
dream 7.6
happy 7.5
mouth 7.5
light 7.4
lifestyle 7.2
photographic paper 7.2

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

person 85.2
indoor 85.1

Face analysis

AWS Rekognition

Age 17-27
Gender Female, 98.5%
Angry 3%
Disgusted 2.7%
Happy 2.5%
Sad 7.8%
Calm 69.3%
Confused 12.1%
Surprised 2.6%

Microsoft Cognitive Services

Age 29
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.9%

Captions