Human Generated Data

Title

Scenes from the Harvesting of Grapes at Mâcon: Couples Dancing

Date

19th century

People

Artist: Unidentified Artist

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Paul J. Sachs and W. G. Russell Allen, 1938.99

Machine Generated Data

Tags

Amazon
created on 2020-05-03

Human 99
Person 99
Person 98.5
Person 97.3
Art 95.9
Drawing 95.9
Person 94.4
Person 93.8
Person 92.7
Person 91.4
Sketch 89.5
Painting 64.9
Person 64.4
Person 60.4
Text 58.3
Duel 56.2

Clarifai
created on 2020-05-03

people 100
group 99.8
adult 99.2
man 98.6
many 98.5
print 98.3
wear 98
leader 97.5
art 97.4
administration 96.5
several 95.4
military 93.6
weapon 93.4
veil 92.1
engraving 91.9
soldier 91.2
illustration 91.2
outfit 90.7
uniform 88.2
group together 87.3

Imagga
created on 2020-05-03

sketch 96.9
drawing 81.4
representation 57
vintage 35.5
grunge 34.9
old 32
antique 27.8
texture 25.7
ancient 25.1
aged 24.4
paper 21.9
retro 21.3
art 19.9
material 17
decoration 16.3
old fashioned 16.2
grain 14.8
flower 14.6
graphic 14.6
dirty 14.4
wallpaper 13.8
grime 13.7
decay 13.5
stain 13.4
obsolete 13.4
structure 13.2
textured 13.1
artistic 13
fracture 12.6
mottled 11.7
decorative 11.7
frame 11.6
history 11.6
worn 11.4
design 11.4
grungy 11.4
floral 11
travel 10.6
damaged 10.5
text 10.5
brown 10.3
graffito 10
effect 10
rough 10
water 10
painterly 9.9
smudged 9.8
style 9.6
edge 9.6
parchment 9.6
aging 9.6
pattern 9.6
wall 9.5
empty 9.4
letter 9.2
silhouette 9.1
religion 9
vase 8.8
crumpled 8.7
crack 8.7
detailed 8.6
glass 8.6
textures 8.5
historical 8.5
sign 8.3
ornate 8.2
symbol 8.1
distressed 7.8
architecture 7.8
leaf 7.8
space 7.8
blackboard 7.7
blank 7.7
rustic 7.7
page 7.4
document 7.4
backdrop 7.4
plants 7.4
gold 7.4
backgrounds 7.3
paint 7.2
border 7.2

Google
created on 2020-05-03

Drawing 77.9
Sketch 66.8
History 62.6
Art 62.5
Illustration 54.5
Artwork 54.3

Microsoft
created on 2020-05-03

sketch 99.7
drawing 99.6
text 96.3
old 94.9
cartoon 85.2
posing 81.5
standing 79.1
group 77.7
illustration 77.6
clothing 74.7
person 71.5
black 68.9
child art 66.6
engraving 64.3
white 60.8
team 28.1
stone 4.6

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 50.3%
Fear 45%
Sad 45.1%
Confused 45%
Happy 45.1%
Surprised 45%
Calm 54.8%
Angry 45%
Disgusted 45%

AWS Rekognition

Age 23-37
Gender Male, 55%
Sad 45%
Angry 45.6%
Disgusted 45%
Surprised 45.8%
Calm 53.5%
Fear 45%
Happy 45%
Confused 45%

AWS Rekognition

Age 23-35
Gender Male, 53.3%
Disgusted 45%
Fear 45.1%
Confused 45%
Calm 54.5%
Surprised 45.1%
Sad 45.1%
Happy 45%
Angry 45.2%

AWS Rekognition

Age 26-40
Gender Male, 54.7%
Angry 45%
Fear 45%
Disgusted 45%
Surprised 45%
Happy 45%
Calm 55%
Sad 45%
Confused 45%

AWS Rekognition

Age 46-64
Gender Male, 53.5%
Angry 45.1%
Fear 52.1%
Confused 45%
Sad 45.2%
Disgusted 45.1%
Surprised 45.4%
Calm 46.7%
Happy 45.5%

AWS Rekognition

Age 32-48
Gender Male, 50.9%
Angry 45.9%
Fear 45%
Confused 45.1%
Sad 45.3%
Disgusted 45%
Surprised 45.2%
Calm 53.4%
Happy 45.1%

AWS Rekognition

Age 17-29
Gender Female, 52.1%
Confused 45%
Disgusted 45%
Calm 47.5%
Angry 45.4%
Happy 45%
Fear 45.1%
Surprised 45%
Sad 51.8%

AWS Rekognition

Age 18-30
Gender Male, 54.6%
Fear 45.6%
Sad 46.5%
Disgusted 45.5%
Angry 48.8%
Surprised 45.4%
Calm 47.7%
Confused 45.1%
Happy 45.5%

AWS Rekognition

Age 17-29
Gender Male, 52.9%
Fear 45.3%
Sad 45.3%
Disgusted 45.1%
Angry 45.6%
Surprised 46.3%
Calm 51.7%
Confused 45.1%
Happy 45.5%

Feature analysis

Amazon

Person 99%
Painting 64.9%

Text analysis

Amazon

lorg
eNtoib lorg
eNtoib
emalemavoe

Google

16307t bNolong
16307t
bNolong