Human Generated Data

Title

Herbert Weir Smyth (1857-1937), replica

Date

1931

People

Artist: Elizabeth Piutti-Barth, American, 1872 - 1959

Sitter: Herbert Weir Smyth, 1857 - 1937

Classification

Paintings

Credit Line

Harvard University Portrait Collection, Commissioned by Harvard University, H393B

Machine Generated Data

Tags

Amazon
created on 2019-11-09

Art 98.1
Painting 97.2
Human 96.5
Person 96.5
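
The four labels above are confidence scores (out of 100) from an automated labeling service. As a non-authoritative sketch, comparable labels can be requested from Amazon Rekognition with boto3 roughly as follows; the image file name, region, and confidence threshold are placeholders, not part of this record:

import boto3

# Placeholder image file and region; assumes AWS credentials are already configured.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("portrait.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=90,  # only return labels scored at or above 90%
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')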

Clarifai
created on 2019-11-09

portrait 99.9
people 99.9
one 99.7
adult 99.6
painting 98.2
man 97.7
wear 97.1
art 96.6
elderly 96
facial expression 93.3
indoors 93.2
old 92.6
leader 91.8
confidence 88.5
vintage 86.8
chair 86.2
mustache 84.9
window 83.5
side view 83.2
jacket 82
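
The Clarifai concepts above appear to come from a general-purpose tagging model. A rough sketch of querying Clarifai's v2 REST API with the requests library, under the assumption that a general concept model alias is used (the API key, model ID, and image URL are placeholders):

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"               # placeholder
MODEL_ID = "general-image-recognition"          # assumed alias for the general concept model
IMAGE_URL = "https://example.org/portrait.jpg"  # placeholder

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concept values are 0-1; scale to match the 0-100 scores listed above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')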

Imagga
created on 2019-11-09

elevator 100
lifting device 100
device 78.8
man 51.1
male 44
portrait 32.3
senior 29
businessman 28.2
people 25.6
adult 24.6
business 24.3
person 23.4
suit 22.7
old 22.3
handsome 21.4
happy 21.3
elderly 21.1
grandfather 19
tie 19
executive 18.8
mature 18.6
success 17.7
looking 16.8
smile 16.4
face 16.3
men 16.3
manager 15.8
corporate 15.5
couple 14.8
necktie 14.8
smiling 14.5
black 14.4
professional 14.3
glasses 13.9
expression 13.6
office 12.8
bow tie 12.8
confident 12.7
work 12.6
standing 12.2
hair 11.9
husband 11.4
together 11.4
shirt 11.2
eyes 11.2
successful 11
home 10.4
sitting 10.3
clothing 9.8
grandmother 9.8
family 9.8
one 9.7
boss 9.6
age 9.5
wife 9.5
love 9.5
aged 9
gray 9
working 8.8
jacket 8.8
older 8.7
lifestyle 8.7
retirement 8.6
close 8.6
head 8.4
attractive 8.4
occupation 8.2
lady 8.1
happiness 7.8
hands 7.8
40s 7.8
retired 7.8
modern 7.7
casual 7.6
adults 7.6
holding 7.4
guy 7.4
friendly 7.3
laptop 7.3
computer 7.2
look 7
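
Imagga returns a similar confidence-ranked tag list. A hedged sketch against its v2 tagging endpoint, which authenticates with an API key/secret pair over HTTP basic auth (credentials and image URL are placeholders):

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"                 # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"           # placeholder
IMAGE_URL = "https://example.org/portrait.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')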

Google
created on 2019-11-09

Microsoft
created on 2019-11-09

art 98.2
human face 95
text 94.4
indoor 93.7
person 92.6
drawing 92.3
gallery 87.2
room 84
portrait 81.2
scene 80.6
man 80.1
clothing 71.2
museum 59.2
picture frame 45.4
painting 30.5
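
The Microsoft tags above resemble output from the Azure Computer Vision analyze operation. A minimal sketch against its REST endpoint; the resource endpoint, key, API version, and image URL are placeholders and may differ from what was used when this record was generated in 2019:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_COMPUTER_VISION_KEY"                                # placeholder
IMAGE_URL = "https://example.org/portrait.jpg"                  # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Confidences are 0-1; scale to match the 0-100 scores listed above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')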

Color Analysis

Face analysis

AWS Rekognition

Age 51-69
Gender Male, 99.4%
Fear 0%
Angry 0.1%
Disgusted 0%
Calm 99.3%
Happy 0%
Confused 0.1%
Sad 0.2%
Surprised 0.2%
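
The age range, gender, and per-emotion confidences above match the face attributes returned by Amazon Rekognition's DetectFaces operation. A minimal boto3 sketch; the image file and region are placeholders:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("portrait.jpg", "rb") as f:  # placeholder image file
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')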

Microsoft Cognitive Services

Age 65
Gender Male
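
The single age and gender estimate above corresponds to face attributes from the Azure Face service. A hedged sketch of its v1.0 detect endpoint as it worked around 2019, when age and gender attributes were still returned; the endpoint, key, and image URL are placeholders:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_FACE_API_KEY"                                       # placeholder
IMAGE_URL = "https://example.org/portrait.jpg"                  # placeholder

response = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

for face in response.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].title()}')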

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
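
The ratings above use the likelihood buckets (Very unlikely through Very likely) that Google Cloud Vision assigns to face annotations. A minimal sketch with the google-cloud-vision client library, assuming a recent library version and configured credentials; the image file is a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("portrait.jpg", "rb") as f:  # placeholder image file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood fields come back as enum buckets such as VERY_UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)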

Feature analysis

Amazon

Painting 97.2%
Person 96.5%