Human Generated Data

Title

Mark/Maquette II

Date

1977

People

Artist: Chuck Close, American, 1940–2021

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, 1994.36

Copyright

© Chuck Close

Machine Generated Data

Tags (label and confidence score, %)

Amazon
created on 2019-04-05

Accessory 99.6
Accessories 99.6
Glasses 99.6
Person 95.2
Human 95.2
Advertisement 85.7
Poster 85.7
Art 85.6
Collage 78.2
Drawing 70.7
Face 70.6
Text 69.1
Head 60.5
Portrait 58.5
Photography 58.5
Photo 58.5
Painting 56.2

Clarifai
created on 2018-04-19

portrait 98
people 97.1
person 95
man 94.8
face 89.9
adult 88.8
one 88.5
old 88.2
paper 88
eyeglasses 80.9
print 80.3
vintage 78.8
closeup 75.2
desktop 73.7
science 71.9
serious 71.6
scientist 71
human 70.8
head 70.7
retro 69.6

Imagga
created on 2018-04-19

jigsaw puzzle 48.4
puzzle 41.8
currency 35
money 34.9
cash 30.2
game 28.5
dollar 25.1
banking 23.9
art 22.7
bank 22.6
bill 21.9
financial 21.4
finance 21.1
face 20.6
banknote 18.4
portrait 17.5
paper 17.3
business 17
payment 15.4
wealth 15.3
mosaic 15
savings 14.9
economy 14.8
closeup 14.8
church 14.8
close 14.3
dollars 13.5
religion 13.4
pay 13.4
note 12.9
negative 12.9
us 12.5
exchange 12.4
rich 12.1
one 11.2
banknotes 10.8
vintage 10.7
hundred 10.6
notes 10.5
painter 10.5
decoration 10.2
film 10.1
man 10.1
people 10
president 9.8
economic 9.7
god 9.6
commerce 9.3
investment 9.2
mask 9
icon 8.7
finances 8.7
artist 8.7
colorful 8.6
person 8.5
old 8.4
fashion 8.3
pattern 8.2
bible 7.8
photographic paper 7.8
antique 7.8
bills 7.8
loan 7.7
faith 7.7
profit 7.7
painted 7.6
capital 7.6
famous 7.4
graffito 7.4
letter 7.3
paint 7.2
tile 7.1
market 7.1
interior 7.1

Google
created on 2018-04-19

blue 96.8
portrait 87.9
art 85.2
vision care 82.2
painting 74.4
glasses 73.9
paper 60.8
artwork 57.6
paint 51.7
paper product 50.9
eyewear 50.3

Microsoft
created on 2018-04-19

person 97.4
indoor 89.1
posing 61.8

Face analysis

AWS Rekognition

Age 27-44
Gender Male, 98.9%
Calm 55.5%
Surprised 7.0%
Happy 9.4%
Confused 10%
Disgusted 2.1%
Angry 3.4%
Sad 12.5%

Microsoft Cognitive Services

Age 46
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Glasses 99.6%
Person 95.2%

Text analysis

Amazon

Mut