Human Generated Data

Title

Scenes from the Harvesting of Grapes at Mâcon: Woman and Two Children

Date

19th century

People

Artist: Unidentified Artist

Classification

Drawings

Machine Generated Data

Tags (confidence scores, %)

Amazon

Human 99.1
Person 99
Drawing 98.5
Art 98.5
Sketch 97.2
Person 95.4
Person 95.4
Person 84.2
Person 68.2
Person 43.8
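Label lists like the Amazon set above are typically returned by an image-labeling API as name/confidence pairs and then filtered and sorted for display. A minimal sketch of that post-processing step (the sample data copies a few entries from the Amazon list above; the `{"Name": ..., "Confidence": ...}` record shape follows AWS Rekognition's DetectLabels output, but no API call is made here):

```python
# Sketch: order and filter machine-generated labels by confidence.
# Sample entries are copied from the Amazon tag list above; the record
# shape mirrors AWS Rekognition's DetectLabels response, but this is
# pure post-processing with no service call.

def top_labels(labels, min_confidence=50.0):
    """Return (name, confidence) pairs at or above the threshold,
    highest confidence first."""
    kept = [(l["Name"], l["Confidence"]) for l in labels
            if l["Confidence"] >= min_confidence]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

sample = [
    {"Name": "Human", "Confidence": 99.1},
    {"Name": "Person", "Confidence": 43.8},
    {"Name": "Drawing", "Confidence": 98.5},
]

print(top_labels(sample))
```

With the default 50% cutoff, the low-confidence Person detection drops out and the remaining labels print highest-confidence first.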

Clarifai

people 100
adult 99.5
group 99.1
art 99
wear 98.8
print 97.8
illustration 97.6
man 97.4
two 96.6
leader 95.3
one 94.4
administration 93.8
veil 93.6
many 92.7
group together 92.2
outfit 92.1
several 91.8
weapon 90.2
soldier 89.7
military 88.9

Imagga

graffito 100
decoration 82.8
grunge 43.4
sketch 42.2
drawing 39.6
vintage 38
old 37.6
aged 35.3
antique 33.7
texture 33.3
retro 31.1
paper 28.2
ancient 27.7
representation 25.3
material 24.1
damaged 22.9
frame 22.5
design 20.3
decay 20.3
pattern 19.8
parchment 19.2
old fashioned 19
dirty 19
art 18.9
grime 17.6
wallpaper 16.8
fracture 16.5
stains 16.5
wall 16.2
decorative 15.9
grain 15.7
worn 15.3
ragged 14.6
crumpled 14.6
paint 14.5
border 14.5
floral 14.5
grungy 14.2
page 13.9
graphic 13.9
flower 13.8
space 13.2
textured 13.1
structure 13.1
rough 12.8
text 12.2
ornament 12.1
mottled 11.7
empty 11.2
blank 11.1
historic 11
distressed 10.8
manuscript 10.8
stain 10.6
rusty 10.5
faded 9.7
crack 9.7
detailed 9.6
obsolete 9.6
brown 9.6
textures 9.5
card 9.4
tattered 8.9
scratch 8.8
artistic 8.7
water 8.7
aging 8.6
leaf 8.6
weathered 8.5
document 8.3
color 8.3
backdrop 8.2
stamp 8.1
detail 8
shabby 7.8
scratched 7.8
your 7.7
edge 7.7
stained 7.7
spot 7.7
dirt 7.6
canvas 7.6
letter 7.3
artwork 7.3
ornate 7.3
book 7.3
backgrounds 7.3
history 7.2
surface 7.1

Google

Drawing 84.7
Art 77.2
Illustration 76.8
Sketch 67.9
Painting 60.3
Artwork 53.4
Style 51

Microsoft

drawing 99.8
sketch 99.8
cartoon 94.1
text 90.9
illustration 90.2
child art 89.5
old 68.1
white 65.3
black and white 50.9
art 50.8
gallery 50.6
vintage 30.6
stone 3.6

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 52.9%
Disgusted 45.9%
Sad 46%
Surprised 45.5%
Fear 46.2%
Happy 45.1%
Angry 45.9%
Calm 50.1%
Confused 45.1%

AWS Rekognition

Age 17-29
Gender Female, 50.3%
Fear 49.5%
Surprised 49.5%
Angry 49.5%
Disgusted 49.5%
Confused 49.5%
Calm 50.3%
Happy 49.5%
Sad 49.6%

AWS Rekognition

Age 24-38
Gender Female, 50.4%
Happy 45%
Calm 45.1%
Disgusted 45%
Confused 45%
Surprised 53.1%
Fear 46.8%
Sad 45%
Angry 45%

AWS Rekognition

Age 30-46
Gender Male, 51.9%
Sad 45.2%
Surprised 45.3%
Angry 45.1%
Calm 54.1%
Happy 45.2%
Fear 45.1%
Confused 45%
Disgusted 45%

AWS Rekognition

Age 22-34
Gender Female, 50.4%
Angry 49.5%
Surprised 49.5%
Calm 50.5%
Sad 49.5%
Fear 49.5%
Happy 49.5%
Confused 49.5%
Disgusted 49.5%
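Each Rekognition face record above reports a confidence score for every emotion; the predicted emotion is simply the highest-scoring entry (for the first face, Calm at 50.1%). A small sketch of that selection (scores are copied from the first face record above; the plain-dict shape is an illustrative assumption, not Rekognition's raw response format):

```python
# Sketch: pick the dominant emotion from a face-analysis score set.
# Scores are copied from the first AWS Rekognition face record above;
# the plain-dict shape is an illustrative assumption.

face_emotions = {
    "Disgusted": 45.9, "Sad": 46.0, "Surprised": 45.5,
    "Fear": 46.2, "Happy": 45.1, "Angry": 45.9,
    "Calm": 50.1, "Confused": 45.1,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(face_emotions))  # ('Calm', 50.1)
```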

Feature analysis

Amazon

Person 99%

Captions

Microsoft

a vintage photo of a person 78.7%
a vintage photo of some people 76.4%
a vintage photo of a person 73.1%