Human Generated Data

Title

Scenes from the Harvesting of Grapes at Mâcon: Harvest Dinner

Date

19th century

People

Artist: Unidentified Artist

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Paul J. Sachs and W. G. Russell Allen, 1938.108

Machine Generated Data

Tags

Amazon
created on 2020-05-02

Person 96
Human 96
Person 95.2
Art 94.1
Painting 89.4
Person 79.1
Horse 78.3
Animal 78.3
Mammal 78.3
Person 77
Workshop 73.7
Person 73.6
Drawing 72.2
Sketch 57.7
Person 52.1

Clarifai
created on 2020-05-02

people 100
group 99.8
print 99.8
adult 99.7
illustration 99.6
engraving 99.2
furniture 99
man 98
seat 97.7
art 97.5
leader 97.1
administration 96.6
many 95.1
several 94.8
two 94.4
wear 94
chair 91
military 91
weapon 90.3
vehicle 90.3

Imagga
created on 2020-05-02

sketch 100
drawing 100
representation 100
grunge 17.9
architecture 16.4
building 15.9
snow 15.4
winter 14.5
old 13.9
cold 13.8
landscape 13.4
vintage 13.2
empty 12
design 11.8
texture 11.8
structure 11.5
forest 11.3
antique 11.2
house 10.9
city 10.8
tree 10.8
space 10.1
window 10.1
park 9.9
travel 9.9
material 9.8
art 9.8
black 9.6
ancient 9.5
paper 9.4
wall 9.4
water 9.3
grain 9.2
frame 9.2
aged 9
retro 9
outdoors 9
style 8.9
graphic 8.8
frozen 8.6
season 8.6
construction 8.6
business 8.5
cool 8
grime 7.8
diagram 7.7
sky 7.7
old fashioned 7.6
plan 7.6
pattern 7.5
decorative 7.5
ice 7.4
peaceful 7.3
paint 7.2
religion 7.2
market 7.1
decoration 7.1
day 7.1

Google
created on 2020-05-02

Drawing 77.9
Art 77.2
Furniture 74.3
Sketch 72.1
Table 69.7
History 68.8
Room 65.7
Illustration 64
Artwork 59.1
Sitting 57.5

Microsoft
created on 2020-05-02

drawing 99.7
sketch 99.6
illustration 90.4
cartoon 90
text 88.8
child art 74.2
furniture 58.9
stone 5.9
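The tag lists above come from five vision services with different vocabularies and confidence scales, so the most robust signal is the labels several services agree on. A minimal sketch of that cross-service comparison, using a subset of the tags listed above (the helper name, data layout, and threshold are illustrative assumptions, not part of any service's API):

```python
# Hypothetical post-processing sketch: find labels that multiple
# vision services agree on, comparing case-insensitively.
from collections import defaultdict

# Subset of the machine-generated tags above, as (label, confidence) pairs.
TAGS = {
    "Amazon": [("Person", 96.0), ("Art", 94.1), ("Painting", 89.4),
               ("Drawing", 72.2), ("Sketch", 57.7)],
    "Clarifai": [("people", 100.0), ("print", 99.8), ("illustration", 99.6),
                 ("furniture", 99.0), ("art", 97.5)],
    "Imagga": [("sketch", 100.0), ("drawing", 100.0), ("art", 9.8)],
    "Google": [("Drawing", 77.9), ("Art", 77.2), ("Furniture", 74.3),
               ("Sketch", 72.1), ("Illustration", 64.0)],
    "Microsoft": [("drawing", 99.7), ("sketch", 99.6), ("illustration", 90.4)],
}

def consensus_labels(tags, min_services=3):
    """Return labels reported by at least `min_services` services."""
    seen = defaultdict(set)
    for service, pairs in tags.items():
        for label, _conf in pairs:
            seen[label.lower()].add(service)
    return sorted(l for l, services in seen.items()
                  if len(services) >= min_services)

print(consensus_labels(TAGS))  # → ['art', 'drawing', 'illustration', 'sketch']
```

Note that near-synonyms with different strings ("Person" vs. "people") are not merged by this simple lower-casing, which is why the consensus here centers on the medium (drawing, sketch, art) rather than the subject.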

Face analysis

AWS Rekognition

Age 20-32
Gender Female, 51.3%
Sad 45.1%
Angry 51.3%
Calm 48.2%
Disgusted 45%
Happy 45%
Confused 45%
Surprised 45.3%
Fear 45.1%

AWS Rekognition

Age 21-33
Gender Male, 50.6%
Confused 45%
Happy 45.1%
Fear 45.1%
Surprised 45%
Angry 45.1%
Sad 50.8%
Disgusted 45%
Calm 48.8%

AWS Rekognition

Age 20-32
Gender Female, 51.8%
Disgusted 45%
Surprised 45.1%
Confused 45%
Happy 46.5%
Angry 45.4%
Fear 45.1%
Calm 52.3%
Sad 45.5%

AWS Rekognition

Age 23-37
Gender Female, 50.2%
Surprised 45%
Disgusted 45.3%
Calm 53%
Fear 45%
Sad 45.2%
Confused 45%
Angry 46.5%
Happy 45%

AWS Rekognition

Age 26-40
Gender Male, 54.8%
Confused 45%
Surprised 45.1%
Disgusted 45%
Calm 49%
Sad 45.4%
Angry 48.2%
Happy 46.6%
Fear 45.6%

AWS Rekognition

Age 42-60
Gender Male, 51.6%
Disgusted 46%
Fear 45.4%
Confused 45.2%
Calm 47.2%
Angry 45.7%
Happy 45.5%
Sad 49.8%
Surprised 45.2%

AWS Rekognition

Age 24-38
Gender Female, 52.5%
Surprised 45%
Angry 45%
Calm 45.9%
Disgusted 45%
Happy 45%
Fear 45.8%
Sad 53.2%
Confused 45%
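Each AWS Rekognition block above reports a per-emotion confidence score for one detected face, and the face's predicted emotion is simply the highest-scoring entry. A minimal sketch using the first and last faces listed above (the flat dict layout is an assumption for illustration, not Rekognition's actual response shape):

```python
# Hypothetical sketch: pick each face's highest-scoring emotion from
# Rekognition-style per-emotion confidence scores.

# Scores copied from the first and last AWS Rekognition faces above.
FACES = [
    {"Sad": 45.1, "Angry": 51.3, "Calm": 48.2, "Disgusted": 45.0,
     "Happy": 45.0, "Confused": 45.0, "Surprised": 45.3, "Fear": 45.1},
    {"Surprised": 45.0, "Angry": 45.0, "Calm": 45.9, "Disgusted": 45.0,
     "Happy": 45.0, "Fear": 45.8, "Sad": 53.2, "Confused": 45.0},
]

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print([dominant_emotion(face) for face in FACES])  # → ['Angry', 'Sad']
```

Because the scores cluster tightly around 45%, the margins are small; for this kind of data the gap between the top two scores is often a better indicator of reliability than the top score alone.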

Microsoft Cognitive Services

Age 24
Gender Female

Feature analysis

Amazon

Person 96%
Painting 89.4%
Horse 78.3%

Categories

Imagga

paintings art 97.1%
text visuals 2.2%

Captions

Microsoft
created on 2020-05-02

a stone statue of a person 31.5%
a close up of a stone wall 31.4%
a stone wall 31.3%