Human Generated Data

Title

Girl Carrying Basket

Date

c. 1876-1878

People

Artist: Charles Herbert Moore, American, 1840-1930

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Fine Arts Department, Harvard University, 1926.33.88

Machine Generated Data

Tags

Amazon
created on 2020-04-25

Art 96.5
Person 95.7
Human 95.7
Clothing 82.1
Apparel 82.1
Drawing 81.3
Painting 65.4
Sketch 62.4
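
The Amazon scores above are label confidence values on a 0-100 scale. A minimal sketch of how similar labels could be requested from Amazon Rekognition with boto3 is shown below; the S3 bucket and object names are hypothetical placeholders, not the museum's actual pipeline.

```python
# Illustrative sketch only: image labels from Amazon Rekognition via boto3.
# The S3 bucket/key names are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-images", "Name": "moore-girl-carrying-basket.jpg"}},
    MinConfidence=60,  # drop labels scored below 60%
)

for label in response["Labels"]:
    # Prints e.g. "Art 96.5", matching the format of the list above
    print(f"{label['Name']} {label['Confidence']:.1f}")
```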

Clarifai
created on 2020-04-25

people 99.7
wear 99.5
one 99.3
veil 98.5
adult 98.3
portrait 98.2
lid 97.3
woman 97
scarf 96.8
child 96.6
outerwear 96.6
art 95.9
print 94.7
facial expression 94.4
bonnet 94.1
man 93.9
coat 93.7
two 89.6
footwear 88.7
lithograph 86.9

Imagga
created on 2020-04-25

person 30.4
man 30.2
people 25.7
male 25.5
adult 18.8
active 18
sport 17.2
black 17
snow 16.2
fun 15.7
happy 15.7
winter 15.3
boy 14.8
cold 13.8
portrait 13.6
attractive 13.3
smiling 13
dance 13
action 13
outdoors 12.7
jacket 12.5
lifestyle 12.3
fashion 12.1
dancer 11.6
guy 11.2
cute 10.8
jumping 10.6
jump 10.6
one 10.4
performer 10.4
motion 10.3
hat 10.2
child 10.2
posing 9.8
businessman 9.7
sky 9.7
season 9.4
bag 9.3
outdoor 9.2
skate 9.1
vertical 8.7
couple 8.7
dancing 8.7
play 8.6
smile 8.6
travel 8.4
joy 8.4
equipment 8.2
sexy 8
cool 8
weather 8
women 7.9
business 7.9
modern 7.7
outside 7.7
pretty 7.7
culture 7.7
expression 7.7
ice 7.6
casual 7.6
two 7.6
walk 7.6
movement 7.5
style 7.4
air 7.4
street 7.4
teen 7.3
teenager 7.3
exercise 7.3
pose 7.2
suit 7.2
happiness 7

Google
created on 2020-04-25

Microsoft
created on 2020-04-25

drawing 98.9
sketch 98.2
clothing 94.9
person 94.7
human face 90.4
cartoon 90.2
painting 81.4
text 61.7

Color Analysis

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 60.1%
Calm 41.5%
Angry 0.3%
Confused 0.6%
Fear 0.4%
Happy 0.3%
Surprised 0.1%
Sad 56.7%
Disgusted 0.1%
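
The age range, gender, and emotion scores above correspond to face attributes returned by Amazon Rekognition's face detection. A minimal sketch, assuming the same hypothetical image location as in the earlier example:

```python
# Illustrative sketch only: face attributes (age range, gender, emotions) from Rekognition.
# The S3 bucket/key names are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-images", "Name": "moore-girl-carrying-basket.jpg"}},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```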

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
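
Unlike the other services, Google Vision reports likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch using the google-cloud-vision client, with a placeholder file path:

```python
# Illustrative sketch only: face likelihoods from the Google Cloud Vision API.
# The local file path is a hypothetical placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("moore-girl-carrying-basket.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum, e.g. VERY_UNLIKELY, as listed above
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```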

Feature analysis

Amazon

Person 95.7%
Painting 65.4%

Categories

Imagga

paintings art 99.6%

Captions

Microsoft
created on 2020-04-25

a person wearing a costume 82.8%
a man wearing a costume 71.4%
a man wearing a hat 64.7%
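
The captions above, each with a confidence score, are the kind of output returned by the Azure Computer Vision describe operation. A minimal sketch using the azure-cognitiveservices-vision-computervision client follows; the endpoint, key, and image URL are hypothetical placeholders.

```python
# Illustrative sketch only: candidate image captions from Azure Computer Vision.
# The endpoint, subscription key, and image URL are hypothetical placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example-region.api.cognitive.microsoft.com/",
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),
)

analysis = client.describe_image(
    "https://example.org/moore-girl-carrying-basket.jpg",
    max_candidates=3,  # return several candidate captions, as in the list above
)

for caption in analysis.captions:
    # Prints e.g. "a person wearing a costume 82.8%"
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```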