Human Generated Data

Title

Walter Charles James and Charles Stewart Hardinge

Date

1829

People

Artist: Thomas Lawrence, English 1769 - 1830

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Mrs. Jesse Isidor Straus in memory of her husband, Jesse Isidor Straus, Class of 1893, 1958.301

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Painting 99.9
Art 99.9
Person 99.2
Human 99.2
Person 98.7

Clarifai
created on 2018-03-16

people 99.9
child 99.3
adult 98.9
painting 98.4
portrait 98.4
wear 97.9
two 97.2
son 97.2
woman 96.9
man 95.2
one 95
art 93.9
baby 93.8
group 93
music 92.3
girl 88.1
theater 86.5
affection 86.2
family 85.6
facial expression 84.4

Imagga
created on 2018-03-16

portrait 29.1
male 28.7
adult 28.2
people 26.8
person 26.6
happy 25.7
man 25.5
couple 25.3
attractive 23.8
together 21.9
love 21.3
sexy 17.7
lady 17
smiling 16.6
handsome 16
smile 15.7
dress 15.4
fashion 15.1
body 14.4
costume 14.3
posing 14.2
happiness 14.1
pretty 14
black 13.8
sitting 13.7
casual 13.6
fun 13.5
dark 13.4
model 13.2
looking 12.8
child 12.5
passion 12.2
face 12.1
elegant 12
mother 11.9
rustic 11.7
jeans 11.5
human 11.2
men 11.2
hair 11.1
two 11
sensuality 10.9
joy 10.9
bow tie 10.8
performer 10.8
father 10.4
style 10.4
friends 10.3
party 10.3
musical instrument 10.1
family 9.8
night 9.8
one 9.7
dancing 9.6
sofa 9.6
boy 9.6
women 9.5
wife 9.5
expression 9.4
cute 9.3
holiday 9.3
sensual 9.1
romance 8.9
romantic 8.9
erotic 8.8
lifestyle 8.7
clothing 8.7
maraca 8.6
son 8.6
sit 8.5
brother 8.5
relationship 8.4
studio 8.4
holding 8.2
suit 8.2
home 8
interior 8
brunette 7.8
standing 7.8
hugging 7.8
married 7.7
professional 7.5
hot 7.5
husband 7.5
outdoors 7.5
room 7.5
20s 7.3
makeup 7.3
dancer 7.3
teacher 7.3

Google
created on 2018-03-16

painting 93.6
lady 86.9
art 83
portrait 78
modern art 69.2
artwork 61.2
girl 51.5
stock photography 50.5

Microsoft
created on 2018-03-16

person 98
boy 97.7
indoor 88.6
young 87.7
painting 16.1
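
Each provider above returns a flat list of label/confidence pairs, with the confidence expressed as a score from 0 to 100. The following is a minimal sketch, in Python and with hypothetical names (the providers' actual response formats differ), of how such a tag set might be represented and filtered by confidence:

from dataclasses import dataclass

@dataclass
class Tag:
    label: str
    confidence: float  # provider-reported score on a 0-100 scale

def high_confidence(tags, threshold=90.0):
    # Keep only tags at or above the confidence threshold
    return [t for t in tags if t.confidence >= threshold]

# Values copied from the Microsoft list above
microsoft_tags = [
    Tag("person", 98.0),
    Tag("boy", 97.7),
    Tag("indoor", 88.6),
    Tag("young", 87.7),
    Tag("painting", 16.1),
]

print([t.label for t in high_confidence(microsoft_tags)])  # ['person', 'boy']

Applied to the Microsoft list above, a threshold of 90 keeps only "person" and "boy".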

Face analysis

AWS Rekognition

Age 16-27
Gender Female, 96.9%
Disgusted 1.1%
Angry 2.8%
Calm 78.4%
Sad 12.2%
Happy 0.6%
Confused 2.4%
Surprised 2.5%

AWS Rekognition

Age 9-14
Gender Female, 99%
Confused 2.1%
Surprised 5.3%
Calm 81.7%
Happy 4%
Angry 0.9%
Sad 3%
Disgusted 3.1%

AWS Rekognition

Age 27-44
Gender Female, 84.6%
Calm 31%
Angry 5.4%
Disgusted 1.8%
Happy 32.3%
Confused 3.6%
Surprised 3.4%
Sad 22.5%

Microsoft Cognitive Services

Age 5
Gender Female

Microsoft Cognitive Services

Age 20
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
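
The AWS Rekognition face results above report a score for each emotion; the per-face scores sum to roughly 100%, and the reported mood is simply the highest-scoring label. A small illustrative sketch in Python, with values copied from the first face above:

# Emotion scores for the first face reported by AWS Rekognition above
emotions = {
    "Calm": 78.4,
    "Sad": 12.2,
    "Angry": 2.8,
    "Surprised": 2.5,
    "Confused": 2.4,
    "Disgusted": 1.1,
    "Happy": 0.6,
}

# The dominant emotion is the highest-scoring label
dominant = max(emotions, key=emotions.get)
print(dominant, emotions[dominant])  # Calm 78.4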

Feature analysis

Amazon

Painting 99.9%
Person 99.2%
