Human Generated Data

Title

Mark/Maquette IV

Date

1977

People

Artist: Chuck Close, American, 1940–2021

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, 1994.37

Copyright

© Chuck Close
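
The cataloguing fields above (title, date, artist, classification, credit line) are also exposed through the Harvard Art Museums public API. A minimal sketch, assuming a valid API key; the query syntax and field names below are illustrative and may differ from the live API:

    import requests

    # Sketch only: query the Harvard Art Museums object endpoint for this record.
    # API_KEY is a placeholder; the title comes from the record above.
    API_KEY = "YOUR_API_KEY"
    resp = requests.get(
        "https://api.harvardartmuseums.org/object",
        params={"apikey": API_KEY, "q": 'title:"Mark/Maquette IV"'},
    )
    for record in resp.json().get("records", []):
        print(record.get("title"), record.get("dated"), record.get("classification"))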

Machine Generated Data

Tags

Amazon
created on 2019-04-05

Head 99.2
Art 97.3
Poster 91.7
Advertisement 91.7
Collage 87.1
Text 82.7
Painting 78
Modern Art 64.6
Paper 62.3
Drawing 60.8
Face 59
Flyer 57.4
Brochure 57.4
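
The Amazon tags above have the shape of an AWS Rekognition DetectLabels response: a label name paired with a confidence score. A minimal boto3 sketch, assuming the image is stored in S3 (the bucket and key below are placeholders):

    import boto3

    # Sketch only: regenerate label/confidence pairs like those listed above.
    rekognition = boto3.client("rekognition")
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "1994_37.jpg"}},
        MinConfidence=55,
    )
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')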

Clarifai
created on 2018-04-19

illustration 94.3
desktop 93.4
portrait 92.7
face 92.3
paper 92
person 90.2
man 89.1
art 88.1
head 87.1
people 86
business 85.7
money 85.5
design 82.4
old 81.7
note 81.2
vintage 81
retro 80.9
one 79.3
color 78.3
image 78.2

Imagga
created on 2018-04-19

map 36.6
art 24.8
representation 23.3
digital 22.7
world 20.4
wallpaper 19.9
design 18
business 17.6
graphic 17.5
sculpture 17.2
futuristic 16.2
tile 15.9
technology 15.6
antique 15.6
grunge 15.3
sketch 15.2
bust 15
drawing 14.9
web 14.3
paper 14.3
vintage 14
modern 14
artistic 13.9
mosaic 13.9
pattern 13.7
global 13.7
plastic art 13.4
old 13.2
fractal 12.9
color 12.8
net 12.6
style 12.6
space 12.4
finance 11.8
texture 11.8
travel 11.3
chart 10.5
plan 10.4
globe 10.2
data 10
light 10
growth 9.7
international 9.5
network 9.3
template 9.2
retro 9
currency 9
3d 8.5
dynamic 8.5
frame 8.3
glowing 8.3
element 8.3
backdrop 8.2
atlas 8.1
dirty 8.1
financial 8
shiny 7.9
black 7.8
corporate 7.7
marble 7.7
money 7.7
direction 7.6
creation 7.6
tech 7.6
perspective 7.5
backgrounds 7.3
effect 7.3
relief 7.3
architecture 7.2
aged 7.2
market 7.1
information 7.1

Google
created on 2018-04-19

yellow 95.8
text 86.5
art 83.3
portrait 75.6
design 68.5
line 65.9
font 60
drawing 57.7
pattern 56.9
illustration 56.8
artwork 56.8
paint 56.4
modern art 53.4
painting 53.1
picture frame 52.9
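
The Google tags above match the label-detection output of the Cloud Vision API (descriptions with scores). A minimal sketch with the google-cloud-vision client, assuming a local copy of the image (the file path is a placeholder):

    from google.cloud import vision

    # Sketch only: print label descriptions with scores scaled to 0-100.
    client = vision.ImageAnnotatorClient()
    with open("mark_maquette_iv.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(label.description, round(label.score * 100, 1))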

Microsoft
created on 2018-04-19

yellow 77.4

Face analysis

Amazon

AWS Rekognition

Age 38-57
Gender Female, 91.6%
Happy 3.7%
Disgusted 0.6%
Sad 37.9%
Angry 3.2%
Calm 38%
Surprised 3.6%
Confused 13.1%
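
The age range, gender, and emotion scores above correspond to the face details Rekognition returns when all facial attributes are requested. A minimal boto3 sketch under the same placeholder S3 assumptions as above:

    import boto3

    # Sketch only: print age range, gender, and per-emotion confidences.
    rekognition = boto3.client("rekognition")
    faces = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "1994_37.jpg"}},
        Attributes=["ALL"],
    )
    for face in faces["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')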

Text analysis

Amazon

your
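
The single detected token above is the kind of result Rekognition's DetectText returns. A minimal boto3 sketch, again with placeholder S3 details:

    import boto3

    # Sketch only: list detected words with confidences.
    rekognition = boto3.client("rekognition")
    text = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "1994_37.jpg"}}
    )
    for detection in text["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"], round(detection["Confidence"], 1))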