Human Generated Data

Title

Two Girls on a Cliff

Date

c. 1883

People

Artist: Winslow Homer, American, 1836–1910

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Mariana G. Van Rensselaer, 1934.122

Machine Generated Data

Tags (label followed by confidence, %)

Amazon
created on 2020-04-25

Person 99.2
Human 99.2
Person 98.6
Art 96.2
Clothing 81.9
Apparel 81.9
Drawing 76.6
Person 68.2
Sketch 66.1
Painting 65.6
Portrait 57.8
Photography 57.8
Photo 57.8
Face 57.8
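
The Amazon labels above are the kind of output returned by the Rekognition DetectLabels operation, where each label carries a confidence score in percent. A minimal sketch using boto3, assuming AWS credentials are already configured; the image file name is hypothetical:

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
# "two_girls_on_a_cliff.jpg" is a placeholder for the digitized image.
import boto3

rekognition = boto3.client("rekognition")

with open("two_girls_on_a_cliff.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # only return labels scored at 50% or higher
)

for label in response["Labels"]:
    # Each label has a Name and a Confidence in percent,
    # matching rows such as "Person 99.2" and "Art 96.2" above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```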

Clarifai
created on 2020-04-25

people 99.9
man 98.8
two 98.7
art 96.1
wear 95.8
child 94.6
adult 94.5
woman 93.6
sepia 91.2
interaction 90.4
three 88.8
family 88.3
sepia pigment 87.1
group 86.6
portrait 85.3
print 84
street 83.6
documentary 83.4
painting 83
one 81

Imagga
created on 2020-04-25

statue 34.5
architecture 25.8
sculpture 24.9
cleaner 23.5
building 20.8
old 19.5
pedestal 18.8
support 18.1
monument 17.7
art 16.9
swab 16.8
history 16.1
historical 16
city 15.8
tourism 15.7
ancient 15.6
seller 15
cleaning implement 14.7
wall 14.5
stone 14.5
religion 14.3
travel 13.4
dirty 12.6
structure 12.2
man 12.1
historic 11.9
landmark 10.8
mask 10.5
church 10.2
danger 10
tourist 10
antique 9.5
supporting structure 9.5
culture 9.4
industrial 9.1
facade 8.7
door 8.6
person 8.4
religious 8.4
famous 8.4
house 8.4
street 8.3
protection 8.2
destruction 7.8
marble 7.7
construction 7.7
musical instrument 7.7
window 7.3
memorial 7.3
detail 7.2
sconce 7.2
wooden 7

Google
created on 2020-04-25

Microsoft
created on 2020-04-25

drawing 99.5
text 99.3
sketch 98.7
painting 98.3
book 96.2
old 91.7
clothing 82.9
person 81
child art 52.1

Face analysis

AWS Rekognition

Age 13-25
Gender Female, 54.1%
Disgusted 45%
Sad 45%
Happy 45%
Calm 55%
Confused 45%
Angry 45%
Surprised 45%
Fear 45%

AWS Rekognition

Age 14-26
Gender Male, 53%
Calm 50.3%
Disgusted 45%
Sad 47%
Surprised 46.3%
Confused 45.1%
Angry 45.3%
Fear 45.7%
Happy 45.2%
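
The two AWS Rekognition face entries above (age range, gender, and per-emotion confidences) correspond to the DetectFaces operation with the full attribute set requested. A minimal sketch with boto3, again using a hypothetical image file:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("two_girls_on_a_cliff.jpg", "rb") as f:
    image_bytes = f.read()

# Request all attributes so age range, gender, and emotions are included.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]        # {"Low": ..., "High": ...}
    gender = face["Gender"]       # {"Value": "Female", "Confidence": ...}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # e.g. CALM, SAD, HAPPY with confidences
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```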

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
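
Unlike Rekognition, the Google Vision face results report likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch of face detection with the google-cloud-vision Python client (v2+ API, hypothetical file name):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("two_girls_on_a_cliff.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood fields are enum values (VERY_UNLIKELY .. VERY_LIKELY), which is
# why the rows above read "Surprise Very unlikely", "Headwear Possible", etc.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```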

Feature analysis

Amazon

Person 99.2%
Painting 65.6%

Captions

Microsoft
created on 2020-04-25

a vintage photo of a person 82.8%
an old photo of a person 82.7%
a vintage photo of a person 78.5%
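
The Microsoft captions above ("a vintage photo of a person …") are the kind of result returned by the Computer Vision Describe operation, which returns candidate captions with confidences in the 0–1 range. A minimal sketch using the azure-cognitiveservices-vision-computervision client; the endpoint, key, and image file are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key; substitute real Cognitive Services credentials.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("two_girls_on_a_cliff.jpg", "rb") as f:
    # Returns candidate captions with confidences, like
    # "a vintage photo of a person 82.8%" above.
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```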