Human Generated Data

Title

Scenes from the Harvesting of Grapes at Mâcon: Women Gathering Grapes

Date

19th century

People

Artist: Unidentified Artist

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Paul J. Sachs and W. G. Russell Allen, 1938.105

Machine Generated Data

Tags

Amazon
created on 2020-05-02

Human 97.4
Art 96
Drawing 95.7
Person 93.5
Person 92.9
Sketch 92.3
Person 91.7
Painting 79.9
Person 51.4
Person 48.1
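
The Amazon tags above follow the shape of AWS Rekognition's label-detection output: a label name plus a 0-100 confidence score. A minimal sketch of how comparable tags could be retrieved with boto3 (the region, bucket, and file name below are placeholders, not values taken from this record):

# Minimal sketch: image labels via AWS Rekognition with boto3.
# Region, bucket, and object key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "my-image-bucket", "Name": "harvest-of-grapes.jpg"}},
    MaxLabels=20,
    MinConfidence=40,
)

# Each label carries a name and a confidence score (0-100),
# which is the shape of the "Amazon" tag list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')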

Clarifai
created on 2020-05-02

people 99.9
group 99.8
print 99.5
art 99
adult 98.9
wear 98.5
many 97.8
man 97.7
several 96.5
leader 96
illustration 95.9
weapon 95
engraving 94.8
two 94.4
administration 94.3
veil 94.2
soldier 92.7
military 91.9
group together 91.2
painting 89.8
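
The Clarifai tags pair a concept name with a confidence score, matching the response of Clarifai's v2 predict endpoint for its general model. A rough sketch using the plain REST API (the API key, model ID, and image URL are placeholders, and the request shape reflects the v2 API as documented around 2020, so it may differ in current versions):

# Rough sketch: concept tagging with Clarifai's v2 predict REST endpoint.
# API key, model ID, and image URL are placeholders / assumptions.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"
GENERAL_MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"  # assumed: public "general" model ID

response = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/harvest-of-grapes.jpg"}}}]},
)
response.raise_for_status()

# Concepts come back with a 0-1 "value"; scaled to 100 they resemble
# the "Clarifai" tag list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')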

Imagga
created on 2020-05-02

graffito 100
decoration 86.1
grunge 40
sketch 37.7
old 36.2
vintage 33.9
drawing 32.4
antique 30.3
texture 29.2
aged 27.1
retro 24.6
wall 24.1
dirty 23.5
ancient 23.3
representation 21.4
material 19.6
damaged 19.1
grungy 19
rough 18.2
old fashioned 18.1
pattern 17.8
surface 17.6
textured 16.6
frame 16.6
design 16.3
empty 16.3
art 15.6
obsolete 15.3
decorative 15
paper 14.9
paint 14.5
decay 14.5
space 14
grain 13.8
structure 13.7
worn 13.4
rusty 13.3
detail 12.9
fracture 12.6
crumpled 12.6
parchment 12.5
aging 12.5
wallpaper 12.2
brown 11.8
grime 11.7
graphic 11.7
weathered 11.4
canvas 11.4
artistic 11.3
blank 11.1
border 10.8
stains 10.7
backgrounds 10.5
mottled 9.7
torn 9.7
page 9.3
color 8.9
burnt 8.7
text 8.7
building 8.7
messy 8.7
stained 8.6
stain 8.6
dirt 8.6
architecture 8.6
effect 8.2
gray 8.1
water 8
black 7.8
ragged 7.8
faded 7.8
snow 7.8
cold 7.7
detailed 7.7
spot 7.7
concrete 7.7
door 7.6
textures 7.6
window 7.5
exterior 7.4
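
The Imagga tags likewise pair an English tag with a 0-100 confidence, which is the shape of Imagga's v2 tagging endpoint. A minimal sketch with the requests library (the API key, secret, and image URL are placeholders):

# Minimal sketch: requesting tags from Imagga's v2 REST API.
# API key/secret and image URL are placeholders.
import requests

IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/harvest-of-grapes.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence,
# matching the "Imagga" tag list above.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')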

Google
created on 2020-05-02

Text 85.2
Art 75.5
Drawing 70.3
Illustration 68.1
History 57.6
Painting 57.1
Visual arts 55
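
The Google tags correspond to label annotations from the Cloud Vision API, each carrying a description and a score. A minimal sketch using the google-cloud-vision client library (the file path is a placeholder; credentials are assumed to come from the environment):

# Minimal sketch: label detection with the Google Cloud Vision client library.
# The file path is a placeholder; credentials come from application default credentials.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("harvest-of-grapes.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Labels carry a description and a 0-1 score; scaled to 100 they
# correspond to the "Google" tag list above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")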

Microsoft
created on 2020-05-02

drawing 99.8
sketch 99.8
text 94.6
person 83.7
cartoon 83.4
child art 82.2
illustration 80.4
clothing 74.3
painting 53.5
old 48.5
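
The Microsoft tags resemble the output of Azure Computer Vision's tagging operation. A minimal sketch of one way to produce comparable output with the azure-cognitiveservices-vision-computervision Python SDK (the endpoint, key, and image URL are placeholders):

# Minimal sketch: image tagging with the Azure Computer Vision SDK.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.org/harvest-of-grapes.jpg")

# Scaling the 0-1 confidence to 100 gives values like the "Microsoft" tag list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")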

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-42
Gender Male, 54.7%
Confused 45.4%
Surprised 45.1%
Happy 45%
Calm 48.8%
Sad 47.1%
Fear 46.9%
Angry 46.5%
Disgusted 45.2%

AWS Rekognition

Age 23-35
Gender Male, 51%
Calm 54.4%
Angry 45.1%
Confused 45%
Happy 45.4%
Sad 45.1%
Disgusted 45%
Fear 45%
Surprised 45.1%

AWS Rekognition

Age 22-34
Gender Female, 52.1%
Fear 45%
Calm 45%
Happy 45%
Surprised 45%
Disgusted 45%
Confused 45%
Sad 55%
Angry 45%
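
Each face-analysis block above reports an estimated age range, a gender guess with confidence, and per-emotion confidences, which is the structure AWS Rekognition returns when all face attributes are requested. A minimal sketch with boto3 (the bucket and file name are placeholders):

# Minimal sketch: face attribute estimation with AWS Rekognition via boto3.
# Bucket and object key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "my-image-bucket", "Name": "harvest-of-grapes.jpg"}},
    Attributes=["ALL"],  # request age range, gender, and emotions
)

# Each detected face reports an age range, a gender guess with confidence,
# and per-emotion confidences, matching the "Face analysis" entries above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')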

Feature analysis

Amazon

Person 93.5%
Painting 79.9%

Categories

Imagga

paintings art 97.4%
nature landscape 2.4%

Captions

Microsoft
created on 2020-05-02

an old photo of a person 73.8%
old photo of a person 68.8%
an old photo of a person 68.7%
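
The caption candidates above carry ranked confidence scores, matching Azure Computer Vision's describe operation, which can return several candidate captions per image. A minimal sketch with the same Python SDK (the endpoint, key, and image URL are placeholders):

# Minimal sketch: caption generation with Azure Computer Vision's "describe" operation.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

description = client.describe_image(
    "https://example.org/harvest-of-grapes.jpg",
    max_candidates=3,  # several ranked captions, as in the list above
)

# Each candidate caption carries a 0-1 confidence score.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}")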