Human Generated Data

Title

Domestic Scene with Woman feeding Cat

Date

19th century

People

Artist: Joseph Gear, British

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Belinda L. Randall from the collection of John Witt Randall, 1898.607

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2019-05-31

Person 98.8
Human 98.8
Person 98.5
Art 98.3
Painting 98.3
Person 90
Drawing 87.6
Sketch 69.2
Photo 58.3
Photography 58.3
Portrait 58.3
Face 58.3
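
The label/confidence pairs above match the shape of Amazon Rekognition's DetectLabels output. A minimal sketch of such a request via boto3 follows; the file name, region, and thresholds are placeholder assumptions, not values taken from the museum's pipeline.

```python
import boto3  # AWS SDK for Python; credentials come from the standard AWS config

client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

# Hypothetical local copy of the drawing's image file.
with open("domestic_scene.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,       # assumed cap on returned labels
    MinConfidence=55,   # assumed threshold, roughly matching the lowest score listed
)

# Print "Label confidence" pairs in the same style as the tag list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```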

Clarifai
created on 2019-05-31

people 98.6
adult 98.2
painting 97.8
family 97.4
man 96.1
woman 95.3
art 95
two 93.5
baby 93.3
room 93.1
son 92.9
illustration 92.7
child 91.7
girl 89.5
love 87.2
portrait 87.1
house 86.8
gown (clothing) 86.2
sit 86.1
bathrobe 85
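
Clarifai's concept scores (0-1 in the raw API, shown here scaled to percent) can be requested from its v2 REST endpoint. The sketch below is an assumption-heavy illustration: the model ID, API key, and file name are placeholders, and the endpoint path reflects Clarifai's public v2 API rather than anything documented for this record.

```python
import base64

import requests  # third-party HTTP client (pip install requests)

API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder credential
MODEL_ID = "general-image-recognition"   # assumed ID of Clarifai's general model

# Hypothetical local copy of the drawing's image file.
with open("domestic_scene.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
    timeout=30,
)
resp.raise_for_status()

# Each concept carries a name and a 0-1 value; scale to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```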

Imagga
created on 2019-05-31

sketch 100
drawing 95.9
representation 82.5
home 26.3
happy 25
man 24.2
people 22.9
adult 22
portrait 21.3
smiling 20.2
happiness 19.6
couple 19.1
person 18.9
love 18.1
couch 17.4
smile 17.1
room 16.6
family 16
male 15.7
mother 15.2
house 15
face 14.9
indoors 14
sitting 13.7
fun 13.5
sofa 13.4
interior 13.3
old 13.2
child 13.2
together 13.1
senior 13.1
looking 12.8
pretty 11.9
lifestyle 11.6
care 11.5
elderly 10.5
husband 10.5
attractive 10.5
expression 10.2
casual 10.2
cute 10
joy 10
kid 9.7
health 9.7
sexy 9.6
hair 9.5
women 9.5
bed 9.5
wife 9.5
clothing 9.1
children 9.1
patient 9.1
cheerful 8.9
father 8.7
hug 8.7
life 8.6
bright 8.6
mature 8.4
sculpture 8.3
fashion 8.3
human 8.2
hospital 8.1
medical 7.9
look 7.9
two 7.6
living 7.6
females 7.6
horizontal 7.5
style 7.4
parent 7.4
art 7.4
playing 7.3
dress 7.2
holiday 7.2
childhood 7.2
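
Imagga's tag scores (already on a 0-100 scale) come from its /v2/tags endpoint, authenticated with an API key/secret pair. The sketch below assumes direct file upload to that endpoint; the credentials and file name are placeholders.

```python
import requests  # third-party HTTP client (pip install requests)

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder credentials
API_SECRET = "YOUR_IMAGGA_API_SECRET"

# Hypothetical local copy of the drawing's image file.
with open("domestic_scene.jpg", "rb") as f:
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),   # HTTP Basic auth with key/secret
        files={"image": f},           # assumed multipart field name
        timeout=30,
    )
resp.raise_for_status()

# Imagga reports confidence on a 0-100 scale with localized tag names.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```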

Google
created on 2019-05-31

Painting 85.4
Illustration 80.2
Art 73.1
Fictional character 52.8
Child 52
Drawing 51.1
Mother 50.8
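
The Google tags look like Cloud Vision label annotations (scores 0-1, shown here scaled to percent). A minimal sketch with the google-cloud-vision client follows; the file name is a placeholder and credentials are assumed to come from GOOGLE_APPLICATION_CREDENTIALS.

```python
from google.cloud import vision  # pip install google-cloud-vision

client = vision.ImageAnnotatorClient()  # picks up GOOGLE_APPLICATION_CREDENTIALS

# Hypothetical local copy of the drawing's image file.
with open("domestic_scene.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Each annotation has a description and a 0-1 score; scale to match the list above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```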

Microsoft
created on 2019-05-31

drawing 96.1
sketch 93.7
person 93.5
painting 92
text 91.6
indoor 88.5
cartoon 83.9
human face 78.5
old 65.9
older 59
baby 55.3
clothing 51.4
posing 49.7
family 23.1
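
The Microsoft tags resemble output from Azure's Computer Vision tagging operation. The sketch below uses the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders, and the exact service tier and API version behind this record are unknown.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_COMPUTER_VISION_KEY"                             # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Hypothetical local copy of the drawing's image file.
with open("domestic_scene.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# Tags carry a name and a 0-1 confidence; scale to match the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```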

Face analysis

AWS Rekognition

Age 4-7
Gender Female, 95.3%
Happy 1.6%
Disgusted 2.6%
Angry 1.4%
Surprised 1%
Sad 91%
Calm 0.9%
Confused 1.5%

AWS Rekognition

Age 26-43
Gender Female, 85.7%
Sad 93.4%
Angry 0.6%
Happy 0.5%
Confused 1%
Calm 3.8%
Surprised 0.3%
Disgusted 0.4%

AWS Rekognition

Age 26-43
Gender Female, 74.4%
Confused 0.8%
Sad 0.2%
Calm 0.2%
Happy 97.6%
Surprised 0.6%
Disgusted 0.2%
Angry 0.3%
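
The age range, gender, and emotion scores above follow the shape of Amazon Rekognition's DetectFaces response when all facial attributes are requested. A minimal boto3 sketch is shown below; the file name and region are placeholder assumptions.

```python
import boto3  # AWS SDK for Python; credentials come from the standard AWS config

client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

# Hypothetical local copy of the drawing's image file.
with open("domestic_scene.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    # Emotions arrive as a list of {Type, Confidence}; report the strongest one.
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f'Age {age["Low"]}-{age["High"]}, '
          f'Gender {gender["Value"]} {gender["Confidence"]:.1f}%, '
          f'{top["Type"].title()} {top["Confidence"]:.1f}%')
```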

Microsoft Cognitive Services

Age 6
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
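
The likelihood ratings above (Very unlikely through Very likely) match Cloud Vision's face-detection likelihood enums. A minimal sketch with the same client library follows; the file name is again a placeholder.

```python
from google.cloud import vision  # pip install google-cloud-vision

client = vision.ImageAnnotatorClient()  # picks up GOOGLE_APPLICATION_CREDENTIALS

# Hypothetical local copy of the drawing's image file.
with open("domestic_scene.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation exposes likelihood enums such as VERY_UNLIKELY / VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name,
          "| Anger", face.anger_likelihood.name,
          "| Sorrow", face.sorrow_likelihood.name,
          "| Joy", face.joy_likelihood.name,
          "| Headwear", face.headwear_likelihood.name,
          "| Blurred", face.blurred_likelihood.name)
```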

Feature analysis

Amazon

Person 98.8%
Painting 98.3%

Categories

Imagga

paintings art 97.7%
people portraits 2.1%
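
Category labels such as "paintings art" and "people portraits" match Imagga's categorization endpoint. The sketch below assumes the public personal_photos categorizer and a response shape mirroring the tags endpoint; the categorizer ID, credentials, and file name are assumptions, not details taken from this record.

```python
import requests  # third-party HTTP client (pip install requests)

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder credentials
API_SECRET = "YOUR_IMAGGA_API_SECRET"
CATEGORIZER = "personal_photos"        # assumed categorizer ID

# Hypothetical local copy of the drawing's image file.
with open("domestic_scene.jpg", "rb") as f:
    resp = requests.post(
        f"https://api.imagga.com/v2/categories/{CATEGORIZER}",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
        timeout=30,
    )
resp.raise_for_status()

# Each category carries a localized name and a confidence on a 0-100 scale.
for cat in resp.json()["result"]["categories"]:
    print(f'{cat["name"]["en"]} {cat["confidence"]:.1f}%')
```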