Human Generated Data

Title

William Fiske Whitney (1850-1912)

Date

1927

People

Artist: Ernest Ludvig Ipsen, American 1869 - 1951

Sitter: William Fiske Whitney, 1850 - 1912

Classification

Paintings

Credit Line

Harvard University Portrait Collection, Gift of friends of Dr. Whitney to the Warren Museum, Harvard Medical School, 1927, H385

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Art 100
Painting 100
Face 99.7
Head 99.7
Photography 99.7
Portrait 99.7
Baton 99.3
Stick 99.3
Person 97.8
Adult 97.8
Male 97.8
Man 97.8

Clarifai
created on 2018-05-09

people 99.6
one 99.5
portrait 99.3
painting 98.9
adult 98.7
man 98.4
print 97
wear 96.6
cape 96
gown (clothing) 95.3
facial hair 95.3
lid 94.1
leader 93.6
art 93.2
mustache 92.7
book series 92
religion 91.6
cap 91.6
indoors 90.8
priest 90.8

Imagga
created on 2023-10-07

suit 59.2
man 57.8
male 53.9
businessman 38
person 33.5
elevator 33.2
portrait 32.4
business 31.6
adult 29.6
handsome 26.7
lifting device 26.6
garment 25.9
people 25.7
black 24.7
clothing 24.3
professional 23.1
executive 22.4
device 20.8
face 20.6
covering 20.3
guy 20
looking 20
tie 19.9
men 19.8
expression 19.6
confident 19.1
work 17.3
office 17
corporate 16.3
manager 15.8
mature 15.8
old 14.6
alone 14.6
jacket 14.6
one 14.2
occupation 13.7
lifestyle 13.7
human 12.7
necktie 12.5
shirt 12.4
smiling 12.3
senior 12.2
hands 12.2
hand 12.2
smile 12.1
success 12.1
consumer goods 12.1
happy 11.9
head 11.8
serious 11.4
businesspeople 11.4
modern 11.2
businessperson 10.7
job 10.6
one person 10.4
hair 10.3
elegant 10.3
model 10.1
judge 9.9
fashion 9.8
attractive 9.8
elderly 9.6
dark 9.2
friendly 9.1
studio 9.1
eyes 8.6
formal 8.6
glasses 8.3
successful 8.2
look 7.9
gray hair 7.9
standing 7.8
masculine 7.8
40s 7.8
1 7.7
only 7.6
casual 7.6
holding 7.4
bow tie 7.3
pose 7.2
stylish 7.2
worker 7.2
posing 7.1
cool 7.1
working 7.1
happiness 7.1

Google
created on 2018-05-09

Microsoft
created on 2018-05-09

man 91.1
person 90.4
blackboard 77.9
picture frame 36.9
painting 20.2
staring 17.5

Color Analysis

Face analysis

AWS Rekognition

Age 57-65
Gender Male, 99.9%
Calm 97.4%
Surprised 6.5%
Fear 5.9%
Sad 2.4%
Disgusted 0.4%
Confused 0.3%
Happy 0.2%
Angry 0.1%

Microsoft Cognitive Services

Age 54
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Baton 99.3%
Person 97.8%
Adult 97.8%
Male 97.8%
Man 97.8%