Human Generated Data

Title

Elizabeth Greenleaf Parsons (1758-1829)

Date

1820

People

Artist: Sarah Goodridge, American 1788-1853

Previous attribution: Edward Greene Malbone, American 1777-1807

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Mr. and Mrs. Robert Wayne Byerly, 1956.63

Machine Generated Data

Tags

Amazon
created on 2019-04-05

Person 92.2
Human 92.2
Painting 88.5
Art 88.5
Pottery 82.5
Porcelain 82.5
Meal 61.5
Food 61.5
Window 55.4

Clarifai
created on 2018-04-19

people 99.9
one 99.8
adult 99.6
portrait 99.5
art 98.7
wear 97.7
veil 97.5
lid 97.1
painting 96.8
container 96.8
man 96.3
print 96.3
side view 95.1
woman 95
politician 93.1
old 91
mammal 90.4
museum 90.2
antique 89.7
profile 89.4

Imagga
created on 2018-04-19

frying pan 27.8
face 25.5
pan 24.5
hat 23.3
person 23.3
cooking utensil 21.8
portrait 21.3
adult 18.7
people 18.4
close 16
model 15.5
expression 15.3
hair 15
kitchen utensil 14.5
ventilator 14.5
washer 14.4
male 14.3
device 14
cute 13.6
fashion 13.6
child 13.5
man 13.4
attractive 13.3
gong 13.3
eyes 12.9
old 12.5
percussion instrument 12.5
facial 12.4
black 12
studio 11.4
boy 11.3
one 11.2
looking 11.2
white goods 10.8
closeup 10.8
childhood 10.7
smile 10.7
container 10.3
toilet seat 10.2
head 10.1
happy 10
vessel 9.9
kid 9.7
clothing 9.6
musical instrument 9.4
music 9
human 9
look 8.7
sombrero 8.7
women 8.7
smiling 8.7
youth 8.5
strainer 8.4
pretty 8.4
art 8.3
seat 8.2
children 8.2
pose 8.1
home appliance 8.1
handsome 8
utensil 7.8
filter 7.7
casual 7.6
hairstyle 7.6
gesture 7.6
hand 7.6
headdress 7.6
sound 7.5
fun 7.5
dollar 7.4
speaker 7.4
cash 7.3
lady 7.3

Google
created on 2018-04-19

photograph 94.8
portrait 74.4
dishware 70
plate 57.2

Face analysis

AWS Rekognition

Age 30-47
Gender Female, 83.7%
Surprised 10.9%
Calm 18.9%
Sad 15.8%
Disgusted 30.7%
Angry 7.4%
Happy 3%
Confused 13.2%

Microsoft Cognitive Services

Age 38
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 92.2%
Painting 88.5%

Captions

Microsoft
created on 2018-04-19

a person sitting in a bowl 40.6%
a close up of a bowl 40.5%
a close up of a dog bowl 36.3%