Human Generated Data

Title

Coat Check Woman, New York City

Date

c. 1982

People

Artist: N. Jay Jaffee, American, 1921–1999

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Artist, P1998.104

Copyright

© The N. Jay Jaffee Trust. All rights reserved. Used by permission. www.njayjaffee.com

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 98.8
Person 98.8
Text 73.2
Face 68.8
Art 59.4
Man 55.1

Imagga
created on 2021-12-14

person 32
business 30.4
computer 27.1
people 26.2
office 25.9
attractive 25.9
laptop 25.3
lady 25.2
web site 23.4
adult 22.9
black 22.4
pretty 22.4
portrait 20.1
businesswoman 20
model 18.7
smile 18.5
professional 18.4
sexy 17.7
corporate 17.2
working 16.8
brunette 16.6
work 16.5
happy 16.3
man 16.1
executive 15.1
businessman 15
suit 14.8
smiling 14.5
studio 14.4
blackboard 14.4
fashion 14.3
sitting 13.8
career 13.3
friendly 12.8
job 12.4
cute 12.2
monitor 12.1
expression 11.9
television 11.9
gorgeous 11.8
face 11.4
desk 11.3
one 11.2
women 11.1
background 11
elegance 10.9
billboard 10.9
dark 10.9
holding 10.7
call 10.4
technology 10.4
style 10.4
hair 10.3
manager 10.2
screen 10.2
student 10
male 9.9
signboard 9.8
human 9.8
support 9.7
success 9.7
looking 9.6
body 9.6
lifestyle 9.4
youth 9.4
casual 9.3
modern 9.1
telephone 8.9
posing 8.9
notebook 8.9
clothing 8.8
assistant 8.8
secretary 8.6
seductive 8.6
confident 8.2
sensual 8.2
lovely 8
look 7.9
helpful 7.9
hands 7.8
elegant 7.7
tie 7.6
chair 7.6
erotic 7.6
communication 7.6
room 7.4
phone 7.4
emotion 7.4
electronic equipment 7.3
sensuality 7.3
dress 7.2
home 7.2
display 7.1
love 7.1
product 7.1
center 7

Microsoft
created on 2021-12-14

wall 96.8
text 94.9
monitor 94.4
person 91.7
television 86.7
black and white 76.8
gallery 76.2
clothing 74.2
man 73.6
human face 68.9
posing 49.4
room 40.1
screenshot 23.1
picture frame 16.9

Face analysis

AWS Rekognition

Age 25-39
Gender Female, 99.7%
Calm 80.2%
Sad 12.9%
Angry 2.9%
Confused 1.5%
Fear 0.9%
Happy 0.6%
Surprised 0.6%
Disgusted 0.4%

Microsoft Cognitive Services

Age 38
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%

Captions

Microsoft

a screen shot of a man 87.2%
a man standing in front of a flat screen television 51.5%
a man standing in front of a flat screen tv 51.4%

Text analysis

Amazon

father
May father
May