Human Generated Data

Title

Cecilia from Behind

Date

1983

People

Artist: Antonio Canet, Cuban, 1942-2008

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Loan from the Fine Arts Library, gift of La Galeria Taller A. Canet, 62.2004.7

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Art 95.9
Human 95.7
Person 95.7
Poster 93.9
Advertisement 93.9
Person 93.1
Painting 92.6
Person 87.3
Person 66.3
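
The Amazon tags above are label/confidence pairs of the kind returned by the Rekognition DetectLabels API. The sketch below is one plausible way such tags could be retrieved with boto3; it is not the museum's documented pipeline, and the file name and thresholds are illustrative assumptions.

```python
# Minimal sketch (not the actual pipeline behind this record) of producing
# label/confidence pairs like the Amazon tags above with Rekognition
# DetectLabels via boto3.
import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the print's image; the real source file
# is not part of this record.
with open("cecilia_from_behind.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,      # illustrative limits, not the values used in 2019
    MinConfidence=60,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```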

Clarifai
created on 2019-10-29

illustration 97.9
people 97
art 96.7
retro 95
vector 94.8
man 93.8
chalk out 93.2
sketch 91.9
picture frame 91.9
paper 90.6
design 90.3
graphic 89.1
card 86.9
image 86.9
adult 85.7
painting 85.5
vintage 85.3
woman 84.3
bill 83.9
decoration 83.6

Imagga
created on 2019-10-29

book jacket 50.1
blackboard 47.9
jacket 39.9
vintage 33.1
envelope 30.7
wrapping 30.6
old 27.2
drawing 26.2
grunge 23.9
retro 23.8
stamp 23.4
sketch 22.7
paper 22
covering 21.7
black 18
note 17.5
frame 17.2
money 17
design 16.4
mail 16.3
letter 15.6
message 15.5
blank 15.4
art 14.6
card 14.6
texture 13.9
ancient 13.8
postmark 13.8
postage 13.8
antique 13
symbol 12.8
aged 12.7
currency 12.6
post 12.4
business 12.2
board 12
cash 11.9
representation 11.5
bill 11.4
empty 11.2
finance 11
border 10.9
postal 10.8
chalkboard 10.8
communication 10.1
decorative 10
bank 9.9
pattern 9.6
artistic 9.6
dollar 9.3
notebook 9.2
global 9.1
dirty 9
philately 8.9
chalk 8.8
grungy 8.5
container 8.3
silhouette 8.3
man 8.1
graphic 8
binding 7.9
circa 7.9
printed 7.9
decoration 7.8
insulating material 7.7
wall 7.7
sign 7.5
banking 7.4
object 7.3
collection 7.2
financial 7.1

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

drawing 98.6
sketch 98.2
cartoon 95.6
illustration 94.8
person 93.4
text 93
gallery 92.8
clothing 89.2
room 87.4
scene 81.4
poster 64.9

Face analysis

AWS Rekognition

Age 21-33
Gender Male, 54.1%
Surprised 46.1%
Confused 45%
Fear 45.2%
Disgusted 45%
Sad 45%
Happy 45%
Calm 45.4%
Angry 53.3%

AWS Rekognition

Age 23-35
Gender Male, 53.1%
Disgusted 45%
Angry 48.9%
Sad 45.7%
Fear 45.1%
Confused 45.1%
Surprised 45.1%
Happy 45%
Calm 50.2%

AWS Rekognition

Age 22-34
Gender Male, 55%
Disgusted 45%
Angry 45%
Fear 45%
Calm 55%
Happy 45%
Sad 45%
Surprised 45%
Confused 45%

AWS Rekognition

Age 22-34
Gender Male, 54.6%
Fear 45%
Happy 45%
Surprised 45.2%
Calm 54.5%
Confused 45%
Disgusted 45%
Sad 45.1%
Angry 45.2%

AWS Rekognition

Age 28-44
Gender Female, 50.6%
Happy 45.1%
Angry 52.5%
Disgusted 45%
Surprised 46.4%
Sad 45.1%
Calm 45.1%
Confused 45%
Fear 45.7%
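
Each AWS Rekognition face entry above (an age range, a gender with confidence, and a score for every emotion) matches the shape of the DetectFaces response when all attributes are requested. A minimal sketch, assuming a local copy of the image:

```python
# Sketch of reading age range, gender, and per-emotion scores from
# Amazon Rekognition DetectFaces; Attributes=["ALL"] is required to
# get these fields. Image source is an assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("cecilia_from_behind.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

faces = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types come back uppercase, e.g. "CALM", "ANGRY"
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```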

Microsoft Cognitive Services

Age 27
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
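
Unlike the AWS entries, the Google Vision block reports likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A minimal google-cloud-vision sketch that prints the same fields, assuming a local image file:

```python
# Sketch of Google Cloud Vision face detection, which reports
# likelihood buckets instead of percentages. File name is illustrative.
from google.cloud import vision

likelihood_names = (
    "Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
)

client = vision.ImageAnnotatorClient()

with open("cecilia_from_behind.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", likelihood_names[face.surprise_likelihood])
    print("Anger", likelihood_names[face.anger_likelihood])
    print("Sorrow", likelihood_names[face.sorrow_likelihood])
    print("Joy", likelihood_names[face.joy_likelihood])
    print("Headwear", likelihood_names[face.headwear_likelihood])
    print("Blurred", likelihood_names[face.blurred_likelihood])
```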

Feature analysis

Amazon

Person 95.7%
Poster 93.9%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

WWIR
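
The single detection "WWIR" is the kind of short string an OCR pass returns for a printed inscription or signature; Amazon's DetectText API produces results in this shape. A hedged sketch, with an illustrative file name:

```python
# Sketch of Amazon Rekognition DetectText, the kind of OCR call that
# yields a short string such as "WWIR". Image source is an assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("cecilia_from_behind.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

result = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in result["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the per-word duplicates
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")
```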