Human Generated Data

Title

Genre Scene with a Man, a Woman and a Cat

Date

17th century

People

Artist: Gerbrand van den Eeckhout, Dutch, 1621-1674

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Maida and George S. Abrams in honor of Agnes Mongan, 1982.137

Machine Generated Data

Tags

Amazon
created on 2020-04-25

Person 98.5
Human 98.5
Person 97.5
Art 95.4
Drawing 94.8
Painting 92.3
Sketch 83.4
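
The Amazon tags above are object and scene labels with confidence percentages of the kind returned by AWS Rekognition label detection. A minimal sketch of how such labels could be retrieved with boto3; the image file name, region, and thresholds are assumptions, not part of this record:

    # Sketch only: assumes configured AWS credentials and a local copy of the
    # image saved as "drawing.jpg"; neither comes from this record.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("drawing.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=10,        # cap the number of labels returned
            MinConfidence=80.0,  # drop low-confidence labels
        )

    # Each label carries a name and a confidence percentage,
    # e.g. "Person 98.5", "Art 95.4" as listed above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")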

Clarifai
created on 2020-04-25

people 99.8
print 99.4
art 99.1
group 98.4
man 98
adult 97.2
illustration 97.2
engraving 96.2
vintage 95.9
wear 95.1
veil 94.3
two 93.1
visuals 92.7
old 92.2
sepia pigment 91.7
antique 90.8
etching 89.9
painting 88.9
ancient 88.3
leader 88.2
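
The Clarifai concepts above (people, print, art, and so on) are the kind of output produced by Clarifai's v2 predict endpoint for its general model. A hedged sketch using plain HTTP; the API key, model ID, and image URL are placeholders, not values from this record:

    # Sketch only: CLARIFAI_API_KEY, GENERAL_MODEL_ID, and the image URL
    # are placeholders/assumptions.
    import requests

    CLARIFAI_API_KEY = "YOUR_API_KEY"
    GENERAL_MODEL_ID = "general-model-id"  # ID of Clarifai's public "general" model

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
        headers={
            "Authorization": f"Key {CLARIFAI_API_KEY}",
            "Content-Type": "application/json",
        },
        json={"inputs": [{"data": {"image": {"url": "https://example.org/drawing.jpg"}}}]},
    )
    response.raise_for_status()

    # Concepts arrive as name/value pairs; the values are probabilities,
    # shown above as percentages (e.g. "people 99.8").
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")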

Imagga
created on 2020-04-25

representation 100
sketch 100
drawing 100
vintage 27.3
grunge 26.4
art 25.4
retro 24.6
old 23
antique 20.8
ancient 20.8
paper 19.6
texture 18.8
design 17.5
aged 17.2
graphic 14.6
money 14.5
finance 14.4
pattern 14.4
currency 14.4
frame 14.2
style 14.1
close 12.6
decorative 12.5
grungy 12.3
note 12
history 11.6
wallpaper 11.5
cash 11
black 10.8
symbol 10.8
material 10.7
structure 10.4
canvas 10.4
decoration 10.4
detail 9.7
ornament 9.5
floral 9.4
business 9.1
silhouette 9.1
border 9
backgrounds 8.9
financial 8.9
color 8.9
fracture 8.8
artistic 8.7
flower 8.5
dollar 8.4
economy 8.3
grain 8.3
effect 8.2
global 8.2
dirty 8.1
bank 8.1
textured 7.9
grime 7.8
banknote 7.8
pay 7.7
obsolete 7.7
map 7.5
rich 7.5
banking 7.4
letter 7.3
historic 7.3
paint 7.2
creative 7.1
travel 7
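
The Imagga tags above (representation, sketch, drawing, and so on) carry confidence scores on a 0-100 scale, matching the shape of Imagga's v2 tagging endpoint. A minimal sketch; the API key, secret, and image URL are assumptions:

    # Sketch only: the API key/secret and image URL are placeholders.
    import requests

    IMAGGA_KEY = "YOUR_API_KEY"
    IMAGGA_SECRET = "YOUR_API_SECRET"
    IMAGE_URL = "https://example.org/drawing.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth
    )
    response.raise_for_status()

    # Confidences are already on a 0-100 scale,
    # e.g. "sketch 100", "vintage 27.3" as listed above.
    for entry in response.json()["result"]["tags"]:
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")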

Google
created on 2020-04-25

Microsoft
created on 2020-04-25

drawing 99.7
sketch 99.6
text 98.5
cartoon 87.7
child art 87.3
painting 79.3
illustration 74.3
old 53.1
stone 7.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 17-29
Gender Female, 52.2%
Angry 45.1%
Sad 46.1%
Happy 45.2%
Calm 53.3%
Surprised 45.1%
Fear 45.1%
Confused 45.1%
Disgusted 45.1%

AWS Rekognition

Age 34-50
Gender Female, 86.2%
Sad 26.5%
Surprised 2.5%
Happy 3%
Fear 1.8%
Angry 2.3%
Confused 46.6%
Disgusted 1.6%
Calm 15.6%
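
The two face records above (age range, gender, and per-emotion confidences) match the shape of the AWS Rekognition DetectFaces response. A minimal boto3 sketch, again assuming a local copy of the image:

    # Sketch only: assumes configured AWS credentials and a local "drawing.jpg".
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("drawing.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]      # e.g. {"Low": 17, "High": 29}
        gender = face["Gender"]     # e.g. {"Value": "Female", "Confidence": 52.2}
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:  # one confidence per emotion type
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")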

Feature analysis

Amazon

Person 98.5%
Painting 92.3%

Categories

Imagga

paintings art 99.7%

Captions

Microsoft
created on 2020-04-25

a close up of a book 29.5%
an old photograph of a book 29.4%
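
The Microsoft tags and captions above are the kind of output returned by the Azure Computer Vision Tag and Describe operations. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image path are assumptions:

    # Sketch only: the endpoint, key, and image path are placeholders.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    # Captions come with confidences, e.g. "a close up of a book 29.5%" above.
    with open("drawing.jpg", "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=3)
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")

    # Tags also carry confidences, e.g. "drawing 99.7" above.
    with open("drawing.jpg", "rb") as f:
        tag_result = client.tag_image_in_stream(f)
    for tag in tag_result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")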