Human Generated Data

Title

Suit Shopping: An Engraved Narrative (diptych)

Date

2002

People

Artist: Andrew Raftery, American (born 1962)

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, M25820.A-B

Copyright

© 2002 Andrew Raftery

Machine Generated Data

Tags

Amazon
created on 2019-10-30

Person 99.4
Human 99.4
Person 99.1
Person 98.1
Art 97
Person 91.7
Person 86.5
Drawing 85.6
Person 79.6
Person 78.1
Painting 77.9
Sketch 75.8
Person 65
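
The Rekognition labels above pair each detected tag with a confidence score. As a rough illustration, the sketch below shows how such labels could be requested with boto3; the bucket name, object key, MaxLabels, and MinConfidence values are placeholders and assumptions, not the settings used to generate this page.

    import boto3

    # Hypothetical S3 location of the digitized print; not the museum's actual source file.
    BUCKET = "example-bucket"
    KEY = "raftery_suit_shopping.jpg"

    client = boto3.client("rekognition")
    response = client.detect_labels(
        Image={"S3Object": {"Bucket": BUCKET, "Name": KEY}},
        MaxLabels=20,
        MinConfidence=60.0,
    )
    # Print each label with its confidence, matching the "tag score" layout above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")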

Clarifai
created on 2019-10-30

illustration 99.9
art 99.6
people 99.1
group 98.5
painting 97
furniture 96.7
man 96.2
adult 95.4
museum 94.5
print 91.8
room 91.3
many 91.3
chair 90.9
exhibition 90.3
wear 89.9
one 89.2
seat 88.8
woman 88.7
sculpture 88.6
two 88.2
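
The Clarifai concepts above come from a general image-recognition model. A rough sketch of the kind of REST call that returns such concepts, using Clarifai's v2 predict endpoint; the API key, model ID, and image URL are placeholders, and the exact endpoint and payload may differ by account and API version.

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
    MODEL_ID = "general-image-recognition"  # assumed public general model ID
    IMAGE_URL = "https://example.org/image.jpg"

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    # Concept values are returned on a 0-1 scale; scale to percentages as shown above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")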

Imagga
created on 2019-10-30

sketch 100
drawing 88.2
representation 68.6
paper 29.8
money 28.1
currency 26
cash 25.6
old 23.7
bank 21.5
dollar 21.3
wealth 20.6
finance 19.4
banking 19.3
exchange 18.1
bill 18.1
vintage 17.4
grunge 17
business 17
texture 16.7
art 16.4
bills 15.5
us 15.4
design 15.2
antique 14.9
pattern 14.3
history 14.3
protective covering 13.8
hundred 13.5
dollars 13.5
financial 13.4
savings 13
rich 13
ancient 13
wall 12.8
pay 12.5
retro 12.3
one 11.9
investment 11.9
binding 11.6
fire screen 11.1
covering 11
note 11
frame 10.9
decoration 10.9
states 10.6
architecture 10.4
style 10.4
home 10.4
empty 10.3
close 10.3
banknote 9.7
black 9.6
pile 9.4
screen 9.3
aged 9
material 8.9
success 8.8
detail 8.8
decor 8.8
symbol 8.7
payment 8.7
ornament 8.6
united 8.6
culture 8.5
stone 8.5
number 8.4
house 8.4
binder 8.3
sign 8.3
backdrop 8.2
paint 8.1
border 8.1
sculpture 7.9
building 7.9
box 7.8
rate 7.8
travel 7.7
blank 7.7
profit 7.6
card 7.6
historical 7.5
commercial 7.5
monument 7.5
economy 7.4
rough 7.3
landmark 7.2
collection 7.2
market 7.1
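
Imagga tags like those above (and the category under Categories below) are typically retrieved over its REST tagging endpoint with HTTP basic authentication. A sketch under that assumption; the key, secret, and image URL are placeholders.

    import requests

    API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
    API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder
    IMAGE_URL = "https://example.org/image.jpg"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    # Each result pairs an English tag with a confidence score.
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")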

Google
created on 2019-10-30

Microsoft
created on 2019-10-30

drawing 99.5
sketch 99.4
gallery 96.5
room 83.2
person 81.8
cartoon 80.2
clothing 79.2
art 78.8
text 75.9
illustration 68.3
museum 66.6
picture frame 28.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-30
Gender Female, 54.2%
Calm 50.7%
Angry 45.3%
Fear 45.1%
Confused 45.1%
Sad 45.1%
Disgusted 45.1%
Happy 48.2%
Surprised 45.5%

AWS Rekognition

Age 22-34
Gender Male, 52.4%
Fear 45%
Angry 45.1%
Confused 45%
Calm 54.8%
Sad 45%
Happy 45%
Surprised 45%
Disgusted 45%

AWS Rekognition

Age 22-34
Gender Male, 54.6%
Confused 45%
Angry 45%
Fear 45%
Sad 45%
Surprised 45%
Disgusted 45%
Happy 45%
Calm 55%

AWS Rekognition

Age 30-46
Gender Female, 50.8%
Sad 45%
Fear 45.1%
Angry 45.2%
Disgusted 45%
Surprised 45.1%
Calm 54.5%
Confused 45%
Happy 45.1%

AWS Rekognition

Age 39-57
Gender Female, 50%
Fear 49.6%
Sad 50.3%
Calm 49.5%
Confused 49.5%
Disgusted 49.5%
Happy 49.5%
Angry 49.6%
Surprised 49.5%

AWS Rekognition

Age 47-65
Gender Male, 50.5%
Sad 49.6%
Happy 49.5%
Confused 49.6%
Surprised 49.8%
Calm 49.9%
Disgusted 49.5%
Fear 49.5%
Angry 49.5%

AWS Rekognition

Age 36-54
Gender Female, 50.1%
Surprised 49.6%
Calm 50.2%
Disgusted 49.5%
Happy 49.6%
Fear 49.5%
Sad 49.5%
Confused 49.5%
Angry 49.5%
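
The age ranges, gender estimates, and emotion scores above correspond to the FaceDetails structure returned by Rekognition's DetectFaces operation. A minimal boto3 sketch of such a call; the local filename is a placeholder, not the museum's source image.

    import boto3

    client = boto3.client("rekognition")
    with open("raftery_suit_shopping.jpg", "rb") as f:  # placeholder filename
        image_bytes = f.read()

    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    # Report age range, gender, and per-emotion confidence for every detected face.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")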

Feature analysis

Amazon

Person 99.4%
Painting 77.9%

Categories

Imagga

paintings art 99.6%

Captions

Microsoft
created on 2019-10-30

a room with art on the wall 63.3%
a close up of a window 41.2%
a window in a room 41.1%
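
The Microsoft captions above match the kind of ranked caption candidates returned by Azure Computer Vision's Describe operation. A sketch using the Python SDK; the endpoint, key, and image URL are placeholders, and method names may vary across SDK versions.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                             # placeholder
    IMAGE_URL = "https://example.org/image.jpg"

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
    description = client.describe_image(IMAGE_URL, max_candidates=3)
    # Caption confidences are returned on a 0-1 scale; scale to percentages as shown above.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")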