Human Generated Data

Title

Untitled (man with cigarette)

Date

1920s

People

Artist: Strauss-Peyton, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1925

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 97.3
Person 97.3
Finger 92.4
Smoke 86.1
Smoking 70.6
Face 63.2

Imagga
created on 2021-12-15

portrait 45.4
person 42
face 35.6
lipstick 32.1
attractive 30.9
makeup 30.2
adult 29.9
phone 26.8
model 26.5
fashion 25.7
pretty 25.3
people 25.2
call 23.8
toiletry 23.3
cosmetic 22
sexy 21.7
smile 20.7
adolescent 20.4
smiling 20.3
happy 20.1
hair 19.9
lady 19.5
cute 19.4
brunette 19.2
businesswoman 19.1
business 18.9
telephone 18.4
make 17.3
expression 17.1
style 17.1
office 16.9
juvenile 16.5
black 16.4
communication 16
studio 16
women 15.9
hand 15.2
elegance 15.2
human 15
eyes 14.7
professional 14.5
elegant 13.7
suit 13.1
lips 13
talking 12.4
looking 12
close 12
one 12
modern 11.9
sensuality 11.8
dress 11.8
lifestyle 11.6
cheerful 11.4
mobile 11.3
clothing 11.1
work 11
confident 10.9
gorgeous 10.9
talk 10.6
corporate 10.3
man 10.1
friendly 10.1
fresh 9.8
working 9.7
success 9.7
look 9.7
communicate 9.6
support 9.5
hair spray 9.4
shirt 9.4
male 9.2
nice 9.2
lovely 8.9
posing 8.9
happiness 8.6
businesspeople 8.6
laughing 8.5
dark 8.4
20s 8.3
blond 8.3
job 8
businessman 8
color 7.8
cellphone 7.8
conversation 7.8
cell 7.7
health 7.7
serious 7.6
casual 7.6
thinking 7.6
head 7.6
executive 7.6
feminine 7.5
service 7.4
natural 7.4
wedding 7.4
long 7.4
stylish 7.3
worker 7.2
eye 7.2
handsome 7.1
secretary 7.1
bride 7

Google
created on 2021-12-15

Forehead 98.5
Face 98.4
Nose 98.4
Lip 97
Chin 96.7
Eyebrow 94.6
Eyelash 92.1
Jaw 88.2
Neck 87.5
Flash photography 87.1
Gesture 85.3
Style 83.8
Tie 77.9
Monochrome photography 74.9
Monochrome 72.1
Formal wear 71.4
Vintage clothing 69.6
Room 64.9
White-collar worker 64.5
Eyewear 63.5

Microsoft
created on 2021-12-15

wall 98.9
person 97.3
human face 94.8
indoor 93.9
portrait 86.8
text 86.7
man 70.9
posing 47.5
picture frame 14.1

Face analysis

AWS Rekognition

Age 18-30
Gender Male, 99.1%
Calm 98.8%
Angry 0.5%
Sad 0.3%
Confused 0.1%
Surprised 0.1%
Happy 0%
Fear 0%
Disgusted 0%

Microsoft Cognitive Services

Age 31
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.3%

Captions

Microsoft

a person standing in front of a mirror posing for the camera 74.6%
a person in front of a mirror posing for the camera 74.5%
a person taking a selfie in front of a mirror posing for the camera 64.8%

Text analysis

Amazon

YORK
SR YORK
SR

Google

NEW
ain
hauol
TOBE
hauol ain NEW TOBE