Human Generated Data

Title

Charles Sprague Sargent (1841-1927) (reading a book)

Date

c. 1903

People

Artist: Sarah Choate Sears, American, 1858–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Montgomery S. Bradley and Cameron Bradley, P1984.53

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 97.5
Human 97.5
Art 93.3
Painting 90.4
Reading 56.1

Imagga
created on 2022-01-23

man 45
male 41.9
black 41.3
person 39.2
adult 36.1
portrait 30.4
people 27.9
handsome 24.1
dark 23.4
studio 21.3
senior 20.6
face 19.9
men 18.9
serious 18.1
expression 17.9
businessman 17.7
sexy 16.9
hair 16.6
human 16.5
one 16.4
model 16.3
body 16
suit 15.5
professional 14.7
attractive 14.7
looking 14.4
elderly 14.4
manager 14
old 13.9
guy 13.6
muscular 13.4
mature 13
grandfather 13
hand 12.7
fitness 12.7
executive 12.5
strong 12.2
glasses 12
fit 12
health 11.8
older 11.7
naked 11.6
lifestyle 11.6
business 11.5
sport 11
skin 11
confident 10.9
torso 10.7
boss 10.5
strength 10.3
work 10.2
training 10.2
active 10.1
power 10.1
success 9.7
muscle 9.6
hands 9.6
formal 9.6
healthy 9.5
exercise 9.1
pose 9.1
fashion 9
look 8.8
smart 8.5
athlete 8.4
happy 8.1
posing 8
to 8
job 8
planner 7.9
biceps 7.9
couple 7.8
masculine 7.8
light 7.8
nude 7.8
modern 7.7
athletic 7.7
gym 7.7
art 7.6
thinking 7.6
arm 7.6
scholar 7.6
bodybuilder 7.6
erotic 7.6
microphone 7.5
shirt 7.5
silhouette 7.5
close 7.4
emotion 7.4
love 7.1
macho 7

Google
created on 2022-01-23

Coat 92.5
Gesture 85.2
Suit 83.5
Tie 83.4
Art 79.1
Collar 76.7
Blazer 72.9
Sleeve 71.3
Vintage clothing 70.1
Self-portrait 68.3
Stock photography 65.7
Visual arts 65.2
Formal wear 64.1
Publication 62.4
History 58.7
Book 58.2
Illustration 57.8
Monochrome photography 57.6
Artist 56.9
Sitting 56.7

Microsoft
created on 2022-01-23

text 99.3
person 97.7
man 95
human face 93.7
painting 89.5
portrait 60.9
old 58.7
clothing 56.6
book 53.3
posing 41.8

Face analysis

AWS Rekognition

Age 56-64
Gender Male, 100%
Calm 97.2%
Sad 2.6%
Confused 0.1%
Disgusted 0%
Angry 0%
Happy 0%
Surprised 0%
Fear 0%

Microsoft Cognitive Services

Age 70
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.5%

Captions

Microsoft

a vintage photo of a man 84.8%
an old photo of a man 83%
a vintage photo of a man in a suit and tie 82.9%