Human Generated Data

Title

James Bradstreet Greenough (1833-1901)

Date

1899

People

Artist: William H. Longmaid, British, active 1886-1909

Sitter: James Bradstreet Greenough, 1833-1901

Classification

Paintings

Machine Generated Data

Tags

Amazon

Art 97.5
Painting 97.5
Person 95.4
Human 95.4
Accessories 76.6
Accessory 76.6
Tie 76.6

Clarifai

people 99.9
portrait 99
one 98.7
man 98.1
adult 97.5
leader 96.8
mustache 95.6
wear 93.8
outfit 93.4
administration 88.9
facial hair 86.9
furniture 86.9
sit 84.9
business 83.1
chair 82
menswear 81.7
painting 78.3
book series 76.7
music 70.9
lid 70.9

Imagga

suit 78.1
businessman 61.9
man 56.5
male 53.3
business 48.6
person 46.1
executive 45.7
handsome 39.3
corporate 35.3
professional 33.7
adult 33.1
portrait 32.4
tie 32.3
men 30.1
confident 30.1
waiter 29.5
garment 29
employee 27.7
necktie 27.3
office 26.5
manager 26.1
people 24.6
dining-room attendant 23.7
speaker 22.8
clothing 22.5
worker 22.2
expression 22.2
successful 22
looking 21.6
face 21.3
black 20.5
success 20.1
mature 19.5
guy 18.9
articulator 18.6
smiling 17.4
serious 17.2
bow tie 17
smile 16.4
baron 16.4
happy 16.3
businesspeople 16.1
working 15.9
work 15.7
standing 14.8
sitting 14.6
boss 14.4
communicator 14.2
one 14.2
shirt 14
hand 13.7
modern 13.3
judge 13.1
lifestyle 13
job 11.5
formal 11.5
senior 11.3
elegant 11.1
glasses 11.1
jacket 10.9
businessmen 10.7
look 10.5
human 10.5
old 10.5
thinking 10.5
consumer goods 10.3
covering 10.2
occupation 10.1
communication 10.1
friendly 10.1
studio 9.9
cheerful 9.8
indoors 9.7
confidence 9.6
hair 9.5
one person 9.4
happiness 9.4
good 9.4
alone 9.1
laptop 9.1
computer 8.8
businessperson 8.8
leadership 8.7
elderly 8.6
model 8.6
career 8.5
smart 8.5
casual 8.5
attractive 8.4
posing 8
colleagues 7.8
expertise 7.8
corporation 7.7
collar 7.7
desk 7.6
company 7.4
holding 7.4
color 7.2
building 7.1

Google

Microsoft

person 99.2
man 96.9
tie 95.9
clothing 95.7
suit 95
human face 94.4
text 86.7
portrait 64.3
gentleman 59.5
shirt 59.1
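The tag sections above are lists of label/confidence pairs returned by each vision service. As a minimal sketch (not part of the catalog record), a hypothetical helper for filtering such pairs by a confidence threshold, with the values hard-coded from the Amazon list above:

```python
def filter_tags(tags, threshold):
    """Keep only tags whose confidence meets the threshold, sorted descending."""
    return sorted(
        ((label, conf) for label, conf in tags.items() if conf >= threshold),
        key=lambda item: -item[1],
    )

# Confidence values copied from the Amazon tag list above.
amazon_tags = {
    "Art": 97.5, "Painting": 97.5, "Person": 95.4, "Human": 95.4,
    "Accessories": 76.6, "Accessory": 76.6, "Tie": 76.6,
}

# Tags at or above 90% confidence.
print(filter_tags(amazon_tags, 90.0))
```

The same filter applies unchanged to the Clarifai, Imagga, and Microsoft lists, since all share the label/confidence shape.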

Face analysis

AWS Rekognition

Age 47-65
Gender Male, 98.8%
Angry 46.2%
Disgusted 0.5%
Surprised 0.9%
Fear 0.4%
Happy 1.6%
Confused 5.9%
Calm 35.5%
Sad 9%
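The AWS Rekognition emotion scores above form a distribution over eight categories. A minimal sketch (an assumption, not part of the record) of selecting the dominant emotion from those scores:

```python
# Emotion scores copied from the AWS Rekognition section above.
emotions = {
    "Angry": 46.2, "Disgusted": 0.5, "Surprised": 0.9, "Fear": 0.4,
    "Happy": 1.6, "Confused": 5.9, "Calm": 35.5, "Sad": 9.0,
}

# The highest-scoring category is taken as the dominant emotion.
dominant = max(emotions, key=emotions.get)
print(dominant)  # → Angry
```

Note that the top two scores (Angry 46.2%, Calm 35.5%) are close, so the dominant label alone understates the model's uncertainty.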

Microsoft Cognitive Services

Age 67
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 95.4%
Tie 76.6%

Captions

Microsoft

a man wearing a suit and tie 94.3%
a man in a suit and tie 94.2%
a man standing in front of a building 75.2%

Text analysis

Amazon

YGn