Human Generated Data

Title

Portrait of a Woman Wearing Red and Black

Date

19th-20th century

People

Artist: Denman Waldo Ross, American, 1853–1935

Classification

Paintings

Machine Generated Data

Tags (confidence scores, %)

Amazon

Art 90.5
Painting 89.2
Person 80.2
Human 80.2
Leisure Activities 70.5
Dance Pose 66.9
Dance 59.8
Performer 57.9

Clarifai

people 99.5
jacket 98.7
adult 98.2
one 97.9
portrait 97.8
wear 97.5
lid 96.5
winter 96.4
coat 95.5
woman 93.7
painting 93.5
retro 92
scarf 91.6
outerwear 89.3
person 89.1
vintage 87.4
art 86.1
brunette 84.9
cold 84.4
costume 84.1

Imagga

costume 37.7
portrait 29.8
man 26.2
person 26.1
stocking 25.6
people 25.1
attractive 23.8
cute 22.2
adult 21.6
face 21.3
male 20.7
clothing 20.7
happy 20
hair 19
garment 18.7
hosiery 17.9
lady 17.8
sexy 17.7
pretty 17.5
hat 17.4
fur coat 17.1
smile 16.4
coat 15.8
fashion 15.1
fur 14.9
scarf 13.6
handsome 13.4
brunette 13.1
holiday 12.2
footwear 11.8
model 11.7
studio 11.4
smiling 10.8
traditional 10.8
eyes 10.3
sitting 10.3
love 10.3
youth 10.2
teenager 10
human 9.7
one 9.7
beard 9.7
look 9.6
father 9.6
women 9.5
expression 9.4
winter 9.4
covering 9.3
glasses 9.3
lips 9.3
makeup 9.1
sensual 9.1
black 9
fun 9
posing 8.9
couple 8.7
casual 8.5
head 8.4
child 8.3
teen 8.3
holding 8.3
fabric 8.2
dress 8.1
family 8
home 8
celebration 8
together 7.9
seasonal 7.9
happiness 7.8
boy 7.8
feather boa 7.7
suit 7.7
old 7.7
wearing 7.6
close 7.4
style 7.4
tartan 7.4
blond 7.4
emotion 7.4
cheerful 7.3
playing 7.3
make 7.3
lifestyle 7.2
looking 7.2
cap 7.2
romance 7.1

Google

Painting 96.4
Portrait 87.5
Art 87.2
Modern art 77.9
Visual arts 76.9
Self-portrait 64.6
Artist 56.3

Microsoft

painting 99.8
drawing 99.7
sketch 98.8
red 96.2
human face 95.3
art 92.6
person 90.4
child art 88.7
wearing 88.2
clothing 79.2
portrait 77.5
woman 69.2

Face analysis

AWS Rekognition

Age 29-45
Gender Male, 66.6%
Disgusted 5.1%
Happy 1.5%
Sad 7%
Calm 70.9%
Angry 5.7%
Surprised 3.1%
Confused 6.7%

Microsoft Cognitive Services

Age 33
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Painting 89.2%
Person 80.2%

Captions

Microsoft

a person wearing a red hat 74.1%
a person wearing a red and white hat 68.1%
a person wearing a red and white stuffed animal 47.5%