Human Generated Data

Title

William James (1771-1832)

Date

1822

People

Artist: Ezra Ames, American, 1768-1836

Sitter: William James, 1771-1832

Classification

Paintings

Machine Generated Data

Tags

Amazon

Human 98.3
Person 98.3
Apparel 80.2
Clothing 80.2
Text 79.2
Art 73.9
Painting 67.7
Money 61.7
Overcoat 61.3
Coat 61.3
Jaw 58.1

Clarifai

people 99.9
one 99.8
portrait 99.6
adult 99.4
leader 98.3
man 98.2
administration 98.2
wear 95.4
sit 94.2
outfit 92.3
chair 91.4
two 90
politician 88.7
music 88.4
furniture 87.2
scientist 87
writer 84.9
art 84.1
military 81.5
actor 81

Imagga

suit 47.2
businessman 44.2
man 41.7
male 39.1
business 38.9
person 36.6
professional 33.8
adult 33.6
office 32.5
corporate 30.1
people 28.5
handsome 27.6
executive 26.5
groom 26
portrait 25.9
happy 22.6
businesswoman 21.8
manager 21.4
work 21.2
necktie 20.9
looking 20.8
men 20.6
job 20.4
confident 20
bow tie 19.8
adolescent 18.7
formal 18.2
success 17.7
smile 17.1
businesspeople 17.1
tie 17.1
meeting 17
clothing 16.7
boss 16.3
modern 16.1
attractive 16.1
building 15.9
lifestyle 15.9
juvenile 15.8
couple 15.7
standing 15.7
successful 15.6
worker 15.5
garment 15.5
serious 15.3
working 15
company 14.9
cheerful 14.6
smiling 14.5
career 14.2
holding 14
sitting 13.7
corporation 13.5
face 13.5
one 13.4
team 13.4
happiness 12.5
call 12.4
boy 12.2
group 12.1
fashion 12.1
human 12
expression 11.9
child 11.8
communication 11.8
black 11.6
employee 11.6
computer 11.2
teamwork 11.1
occupation 11
laptop 10.9
guy 10.9
leadership 10.6
love 10.3
two 10.2
alone 10
jacket 9.8
colleagues 9.7
workplace 9.5
smart 9.4
casual 9.3
friendly 9.2
indoor 9.1
hand 9.1
businessmen 8.8
together 8.8
two people 8.8
partnership 8.6
secretary 8.5
elegance 8.4
pretty 8.4
waiter 8.4
dress 8.1
family 8
home 8
brother 7.9
model 7.8
old 7.7
confidence 7.7
outdoors 7.5
20s 7.3
teenager 7.3
color 7.2
hair 7.1
women 7.1
room 7.1
elevator 7.1
life 7.1
indoors 7

Microsoft

person 99.5
clothing 97
suit 96.5
human face 95.2
text 94.5
man 93.5
black and white 91
tie 69.5
black 67.5
portrait 62.9

Face analysis

AWS Rekognition

Age 32-48
Gender Male, 87.5%
Fear 0.1%
Confused 0.3%
Calm 96.8%
Sad 0.3%
Happy 0.4%
Angry 1.2%
Disgusted 0.5%
Surprised 0.4%

Microsoft Cognitive Services

Age 38
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.3%
Painting 67.7%

Captions

Microsoft

a black and white photo of a man 90.3%
a man sitting on a bench posing for the camera 82.5%
a man sitting on a bench 79.7%
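The per-service tag lists above lend themselves to simple cross-service aggregation. The sketch below is illustrative only: the tag/confidence values are transcribed from a few of the entries listed above, but the dictionaries, the `consensus_tags` helper, and the merge logic are assumptions for demonstration, not part of any provider's API.

```python
from collections import defaultdict

# A few tags and confidences transcribed from the record above.
amazon = {"person": 98.3, "painting": 67.7, "art": 73.9, "text": 79.2, "clothing": 80.2}
clarifai = {"people": 99.9, "portrait": 99.6, "man": 98.2, "art": 84.1, "adult": 99.4}
imagga = {"man": 41.7, "portrait": 25.9, "suit": 47.2, "adult": 33.6, "clothing": 16.7}
microsoft = {"person": 99.5, "clothing": 97.0, "suit": 96.5, "man": 93.5, "portrait": 62.9}

services = {"Amazon": amazon, "Clarifai": clarifai,
            "Imagga": imagga, "Microsoft": microsoft}

def consensus_tags(services, min_services=3):
    """Return tags reported by at least `min_services` services,
    mapped to their mean confidence (rounded to one decimal)."""
    hits = defaultdict(list)
    for tags in services.values():
        for tag, conf in tags.items():
            hits[tag].append(conf)
    return {tag: round(sum(confs) / len(confs), 1)
            for tag, confs in hits.items() if len(confs) >= min_services}

print(consensus_tags(services))
```

With this subset, only "man", "portrait", and "clothing" appear in three or more services; a lower threshold would also surface tags like "person" and "art" that two services agree on.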