Human Generated Data

Title

Scenes from the Harvesting of Grapes at Mâcon: Women and Children Gathering Grapes

Date

19th century

People

Artist: Unidentified Artist

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Paul J. Sachs and W. G. Russell Allen, 1938.98

Machine Generated Data

Tags (label and confidence, %)

Amazon
created on 2020-05-02

Art 95
Person 94.8
Human 94.8
Person 94.3
Painting 92.5
Person 90.6
Person 88.6
Person 80
Drawing 74
Sketch 58.9
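
The label and confidence pairs above have the shape of AWS Rekognition DetectLabels output. A minimal boto3 sketch follows; the bucket, object key, and thresholds are placeholders, not the values actually used to generate this record.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "1938.98.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)

# Each label carries a name and a confidence on a 0-100 scale,
# matching pairs such as "Art 95" and "Person 94.8" above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')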

Clarifai
created on 2020-05-02

people 99.9
illustration 99.6
group 99.5
print 99.3
adult 99.2
art 99.2
wear 97.5
man 97.1
many 95.6
weapon 93.9
leader 93.8
veil 93.7
administration 91.9
several 91.7
soldier 90.6
watercraft 90
lithograph 89.4
military 88.4
painting 87.5
engraving 87.3
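
Concept tags like these typically come from Clarifai's general image-recognition model. A minimal sketch against the Clarifai v2 REST API follows; the API key, model id, and image URL are placeholders, and this is only an assumption about how the record was produced.

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed public model id
IMAGE_URL = "https://example.org/1938.98.jpg"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Concepts come back with a value in [0, 1]; scaling by 100 gives
# percentages comparable to "people 99.9" and "illustration 99.6" above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')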

Imagga
created on 2020-05-02

sketch 100
drawing 100
representation 82.4
grunge 35.7
old 34.1
vintage 29.8
antique 26.8
snow 26.3
aged 26.2
bicycle 24.5
retro 22.9
texture 21.5
ancient 19.9
wall 19.7
bike 19.5
graffito 19.2
decoration 18.5
dirty 18.1
old fashioned 18
obsolete 15.3
material 15.2
frame 15
cycle 14.6
decay 14.5
paper 14.1
grain 13.8
transport 13.7
aging 13.4
damaged 13.3
grungy 13.3
black 13.2
space 13.2
graphic 13.1
pattern 13
fracture 12.6
structure 12.4
empty 12
rough 11.8
art 11.8
grime 11.7
transportation 11.7
rusty 11.4
wheel 11.3
weather 11.2
cold 11.2
mottled 10.7
wallpaper 10.7
design 10.7
crumpled 10.7
textured 10.5
canvas 10.4
winter 10.2
tourist 10
paint 10
border 9.9
pedal 9.9
ride 9.7
parchment 9.6
weathered 9.5
historic 9.2
sport 9.1
city 9.1
landscape 8.9
surface 8.8
text 8.7
seat 8.6
window 8.4
decorative 8.3
backgrounds 8.1
man 8.1
metal 8
detail 8
urban 7.9
faded 7.8
travel 7.7
torn 7.7
blank 7.7
flower 7.7
rustic 7.7
spot 7.7
dirt 7.6
worn 7.6
house 7.5
silhouette 7.4
street 7.4
brown 7.4
effect 7.3
road 7.2
history 7.1
building 7.1
cool 7.1
rural 7
vehicle 7
architecture 7
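
Imagga exposes a tagging endpoint that returns this shape of data, a tag word plus a 0-100 confidence. A minimal sketch follows, assuming the v2 /tags endpoint with HTTP basic authentication; the key, secret, and image URL are placeholders.

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/1938.98.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry pairs a confidence (0-100) with a language-keyed label,
# e.g. "sketch 100" and "drawing 100" in the list above.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')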

Google
created on 2020-05-02

Microsoft
created on 2020-05-02

sketch 99.5
drawing 99.5
text 97.4
book 94.2
clothing 90.7
cartoon 89.2
person 87.9
dress 83.4
woman 83.2
illustration 80.9
painting 75.5
child art 53.8
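
These tags match the output of the Azure Computer Vision tagging operation, the likely source of the "Microsoft" data. The sketch below uses the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders, and the exact call is an assumption about how the record was generated.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),  # placeholder key
)

result = client.tag_image("https://example.org/1938.98.jpg")

# Each tag has a name and a confidence in [0, 1]; scaled by 100 these
# resemble "sketch 99.5", "drawing 99.5", "text 97.4" above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")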

Face analysis

Amazon

AWS Rekognition

Age 21-33
Gender Male, 53.3%
Disgusted 45%
Calm 54%
Happy 45.1%
Fear 45.3%
Confused 45%
Sad 45.2%
Angry 45.4%
Surprised 45.1%

AWS Rekognition

Age 23-37
Gender Male, 53.8%
Angry 45%
Fear 45%
Surprised 45%
Happy 45%
Sad 45.2%
Confused 45%
Calm 54.7%
Disgusted 45%

AWS Rekognition

Age 22-34
Gender Female, 52.5%
Surprised 45%
Disgusted 45%
Calm 54.9%
Fear 45%
Sad 45%
Confused 45%
Angry 45%
Happy 45%

AWS Rekognition

Age 39-57
Gender Male, 54.6%
Calm 46.6%
Surprised 45.4%
Disgusted 45.3%
Happy 45%
Confused 49.8%
Angry 47.1%
Sad 45.5%
Fear 45.2%
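
The age range, gender, and per-emotion confidences above have the shape of AWS Rekognition DetectFaces output when all facial attributes are requested. A minimal boto3 sketch follows; the bucket and object key are placeholders.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "1938.98.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Each emotion is scored independently on a 0-100 scale.
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')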

Feature analysis

Amazon

Person 94.8%
Painting 92.5%
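
One plausible reading of this Feature analysis list is that it keeps only the Rekognition labels returned with bounding-box instances (here Person and Painting), since DetectLabels attaches an Instances array to such labels. The helper below sketches that filtering; it is an assumption about how the list was derived, not a documented step.

def instance_labels(detect_labels_response):
    # Keep only labels that Rekognition localized with bounding boxes.
    return [
        (label["Name"], label["Confidence"])
        for label in detect_labels_response["Labels"]
        if label.get("Instances")
    ]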
