Human Generated Data

Title

Dancing Party

Date

19th century

People

Artist: Eugene de Rochas, French

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Marian H. Phinney, 1962.46

Machine Generated Data

Tags

Amazon
created on 2020-05-02

Human 95.5
Person 95.5
Art 95.4
Painting 95.4
Drawing 91.8
Person 87.1
Person 72.6
Person 70.3
Text 68.6
Sketch 66.4
Person 65.3
Person 61.4
Person 60.2
Advertisement 58.2
Person 54.6

Clarifai
created on 2020-05-02

people 99.8
print 99.6
illustration 99.5
art 99.2
group 99.1
adult 98.6
engraving 97.4
man 96.9
veil 95.5
vintage 94.7
wear 94.5
painting 94.2
antique 94.2
paper 93
woman 90.6
etching 89.7
many 89.2
old 87.9
desktop 86.6
visuals 86.6

Imagga
created on 2020-05-02

sketch 35.6
graffito 35.2
vintage 31.5
money 30.7
currency 30.5
drawing 30.4
cash 29.3
paper 28.5
decoration 28.2
old 25.8
envelope 25.8
ancient 25.1
dollar 23.2
retro 23
representation 22.9
finance 22.8
bank 22.4
grunge 22.2
banking 19.3
brass 19.1
antique 19.1
bill 19
memorial 18.8
financial 18.7
savings 18.7
structure 18.6
aged 18.1
stamp 17.4
dollars 17.4
art 17.1
business 16.4
wealth 16.2
texture 16
mail 14.4
close 14.3
postmark 13.8
note 13.8
hundred 13.6
one 13.5
rich 13
letter 12.9
newspaper 12.9
banknotes 12.7
banknote 12.6
exchange 12.4
pattern 12.3
investment 11.9
container 11.9
postage 11.8
bills 11.7
design 11.6
us 11.6
pay 11.5
united 11.4
economy 11.1
frame 10.8
postal 10.8
book jacket 10.7
states 10.6
product 10.5
border 10
history 9.9
binding 9.8
finances 9.6
payment 9.6
dirty 9
printed 8.9
detail 8.9
funds 8.8
notes 8.6
loan 8.6
profit 8.6
post 8.6
blank 8.6
creation 8.4
jacket 8.4
card 8.2
religion 8.1
market 8
philately 7.9
circa 7.9
covering 7.8
wall 7.7
culture 7.7
wallpaper 7.7
worn 7.7
symbol 7.4
backgrounds 7.3
global 7.3
black 7.2

Google
created on 2020-05-02

Microsoft
created on 2020-05-02

text 100
drawing 99.7
sketch 99.7
book 99.3
cartoon 86
illustration 75
painting 73.8
child art 72.7
person 70.7
man 57.4

Face analysis

Amazon

AWS Rekognition

Age 30-46
Gender Female, 52%
Surprised 45%
Happy 45%
Confused 45%
Calm 54.6%
Disgusted 45%
Fear 45%
Sad 45.3%
Angry 45%

AWS Rekognition

Age 13-25
Gender Male, 53.4%
Surprised 45.2%
Calm 54.3%
Angry 45.1%
Sad 45.1%
Fear 45%
Confused 45%
Disgusted 45%
Happy 45.3%

AWS Rekognition

Age 37-55
Gender Female, 50.6%
Angry 45.5%
Disgusted 47.3%
Happy 46.3%
Sad 45.3%
Fear 45.1%
Surprised 45.2%
Confused 45.1%
Calm 50.1%

AWS Rekognition

Age 24-38
Gender Male, 54.9%
Surprised 45%
Disgusted 45%
Happy 45.1%
Angry 45.1%
Sad 45.7%
Calm 54.1%
Fear 45%
Confused 45%

AWS Rekognition

Age 12-22
Gender Female, 54.7%
Sad 45.1%
Fear 45.5%
Disgusted 45%
Angry 45.3%
Confused 45%
Surprised 45.4%
Happy 51.8%
Calm 46.9%

AWS Rekognition

Age 36-52
Gender Male, 54.5%
Calm 45%
Sad 45%
Disgusted 45%
Surprised 45.6%
Angry 45.1%
Fear 53.9%
Confused 45.3%
Happy 45%

AWS Rekognition

Age 18-30
Gender Female, 52.7%
Disgusted 45%
Confused 45%
Sad 45%
Calm 54.7%
Happy 45.3%
Angry 45%
Surprised 45%
Fear 45%

AWS Rekognition

Age 36-52
Gender Male, 52.5%
Disgusted 45%
Angry 54.6%
Calm 45.2%
Fear 45%
Surprised 45%
Confused 45%
Sad 45%
Happy 45%

AWS Rekognition

Age 26-40
Gender Female, 50.6%
Confused 45.2%
Sad 45.8%
Fear 50.9%
Happy 45.1%
Angry 46.9%
Disgusted 45.5%
Surprised 45.5%
Calm 45.1%

AWS Rekognition

Age 28-44
Gender Female, 50.5%
Angry 45.2%
Disgusted 45%
Fear 45.5%
Sad 47.3%
Happy 48.7%
Surprised 45.3%
Calm 48%
Confused 45.1%

AWS Rekognition

Age 29-45
Gender Male, 54.3%
Disgusted 45.3%
Calm 45.4%
Angry 46.7%
Confused 45.4%
Fear 47.5%
Happy 45.8%
Sad 48.8%
Surprised 45.1%

AWS Rekognition

Age 29-45
Gender Male, 54.2%
Sad 45%
Disgusted 45%
Happy 45%
Surprised 45%
Calm 54.9%
Angry 45%
Fear 45%
Confused 45%

Feature analysis

Amazon

Person 95.5%
Painting 95.4%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2020-05-02

a close up of a book 60%
close up of a book 54.8%
a photo of a book 54.7%

Text analysis

Amazon

1962.46