Human Generated Data

Title

At the Market

Date

1931?

People

Artist: Jack Levine, American, 1915–2010

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Gift, 1953.127

Copyright

© Jack Levine Estate / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2020-04-25

Person 98.2
Human 98.2
Art 96
Painting 96
Drawing 85.9
Person 81
Sketch 71.9
Person 62.8
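These label/confidence pairs have the shape returned by Amazon Rekognition's label-detection service. As a minimal sketch (the pairs are transcribed from the list above; the confidence threshold is an illustrative assumption, not part of the record), filtering to high-confidence labels might look like:

```python
# Tags as (label, confidence) pairs, transcribed from the Amazon list above.
amazon_tags = [
    ("Person", 98.2), ("Human", 98.2), ("Art", 96.0), ("Painting", 96.0),
    ("Drawing", 85.9), ("Person", 81.0), ("Sketch", 71.9), ("Person", 62.8),
]

def high_confidence(tags, threshold=90.0):
    """Keep labels at or above the threshold, deduplicating repeats
    (Rekognition reports one entry per detected instance, so 'Person'
    appears three times above)."""
    seen = set()
    out = []
    for label, conf in tags:
        if conf >= threshold and label not in seen:
            seen.add(label)
            out.append(label)
    return out

print(high_confidence(amazon_tags))  # ['Person', 'Human', 'Art', 'Painting']
```

Lowering the threshold admits the weaker detections while still collapsing the repeated per-instance labels.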

Clarifai
created on 2020-04-25

people 99.2
art 98.1
illustration 98
print 97.2
chalk out 96.7
man 96.6
adult 95.5
wear 95.4
vintage 94.9
group 94.4
paper 94.4
desktop 94.3
veil 93.3
sepia pigment 91.3
old 90.8
retro 90.7
antique 90.1
design 89.8
painting 89
ancient 84.6

Imagga
created on 2020-04-25

sketch 77.2
drawing 73.8
representation 46.2
design 31
pattern 28.8
art 26.3
atlas 26.2
graphic 25.6
element 22.4
decorative 21.8
wallpaper 19.2
vintage 19.1
map 18.3
decoration 17.4
retro 17.3
antique 17
shape 15.7
symbol 15.5
frame 15.2
style 14.9
old 14.7
cartoon 14.3
texture 13.9
grunge 13.7
floral 13.7
card 12.8
paper 12.6
decor 12.4
modern 12
set 11.9
border 11.8
sign 11.3
icon 11.1
ornate 11
flower 10.8
curve 10.5
plan 10.4
finance 10.2
capital 9.6
scroll 9.6
color 9.5
label 9.4
leaf 9.4
elements 9.3
clip art 9.3
painting 9.2
silhouette 9.1
backdrop 9.1
gold 9.1
black 9
relief 8.8
line 8.6
seamless 8.6
money 8.5
hand 8.5
financial 8
world 8
creative 8
location 7.9
artistic 7.8
sepia 7.8
travel 7.8
geography 7.7
navigation 7.7
arrow 7.6
journey 7.6
traditional 7.5
page 7.4
collection 7.2
currency 7.2
holiday 7.2
draw 7.2
space 7

Google
created on 2020-04-25

Drawing 96
Sketch 95.6
Art 77.2
Artwork 75.8
Figure drawing 70.1
Line art 66.9
Illustration 62.3
Painting 51.9

Microsoft
created on 2020-04-25

sketch 99.8
drawing 99.7
text 92.7
child art 92
art 90.1
cartoon 84.9
illustration 70.3
linedrawing 49.5

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 86%
Disgusted 0.3%
Calm 19.5%
Surprised 0.3%
Angry 0.8%
Happy 1%
Sad 77%
Confused 0.7%
Fear 0.6%

AWS Rekognition

Age 19-31
Gender Female, 52.3%
Sad 45%
Confused 45%
Disgusted 45%
Happy 45.4%
Surprised 45%
Calm 54.4%
Angry 45.1%
Fear 45%

AWS Rekognition

Age 19-31
Gender Female, 97.2%
Confused 0.5%
Fear 0.3%
Angry 0.4%
Happy 5.1%
Calm 23.7%
Surprised 0.1%
Sad 69.7%
Disgusted 0.3%

AWS Rekognition

Age 14-26
Gender Male, 53.1%
Angry 45%
Surprised 45%
Happy 45%
Fear 45%
Disgusted 45%
Calm 54.9%
Sad 45%
Confused 45%

Microsoft Cognitive Services

Age 40
Gender Female

Feature analysis

Amazon

Person 98.2%
Painting 96%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2020-04-25

a drawing of a map 66.5%
a close up of a map 66.4%
drawing of a map 58.1%

Text analysis

Amazon

erive

Google

fask Lerine ee 6 1931
fask
Lerine
ee
6
1931