Human Generated Data

Title

Suburban Musical

Date

20th century

People

Artist: James Daugherty, American 1887 - 1974

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Waltrud Lampé, 2006.256

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2020-04-30

Painting 99.6
Art 99.6
Human 97
Person 97
Person 64.4
Mural 59.5
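
The Amazon tags above follow the shape of an AWS Rekognition DetectLabels response, which returns label names with confidence scores. A minimal sketch of flattening such a response into the name/score pairs listed here (the sample response dict is hypothetical; a real call would go through boto3's `rekognition.detect_labels`):

```python
# Hypothetical Rekognition-style DetectLabels response; a live call would be
# boto3.client("rekognition").detect_labels(Image={...}, MinConfidence=50)
response = {
    "Labels": [
        {"Name": "Painting", "Confidence": 99.6},
        {"Name": "Art", "Confidence": 99.6},
        {"Name": "Person", "Confidence": 97.0},
    ]
}

# Flatten to (name, confidence) pairs, rounded as in the listing above
tags = [(label["Name"], round(label["Confidence"], 1))
        for label in response["Labels"]]

for name, conf in tags:
    print(f"{name} {conf}")
```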

Clarifai
created on 2020-04-30

art 99.9
painting 99.8
religion 98.8
illustration 98.7
artistic 97.7
saint 97.3
visuals 96.6
fresco 95.9
god 94.9
color 94.4
book 92.1
church 91.6
holy 91.6
interior 91.1
design 90.5
decoration 90
spirituality 89
culture 89
painter 88.5
mural 88.2

Imagga
created on 2020-04-30

art 40.7
mosaic 34
decoration 27.9
majolica 26.6
earthenware 24.8
colorful 20.1
ceramic ware 19.5
design 19.1
religion 18.8
painter 17.2
culture 17.1
old 16.7
texture 14.6
temple 14.4
pattern 14.4
money 13.6
sculpture 13.6
color 13.3
religious 13.1
artistic 13
utensil 12.8
graffito 12.6
currency 12.6
traditional 12.5
ancient 12.1
golden 12
painting 11.7
bank 11.6
creation 11.2
style 11.1
paper 11
gold 10.7
faith 10.5
statue 10.5
mask 10.4
cash 10.1
drawing 9.6
tradition 9.2
banking 9.2
tattoo 9.1
close 9.1
carving 9.1
retro 9
colors 8.8
graphic 8.8
antique 8.7
prayer 8.7
wall 8.6
transducer 8.5
travel 8.4
finance 8.4
china 8.4
modern 8.4
church 8.3
symbol 8.1
detail 8
covering 7.8
pray 7.8
flower 7.7
grunge 7.7
wallpaper 7.7
god 7.7
festival 7.7
decorative 7.5
artwork 7.3
financial 7.1
decor 7.1

Google
created on 2020-04-30

Painting 95.7
Art 92.8
Watercolor paint 70.3
Tapestry 68.4
Prophet 68.1
Textile 65.6
Mural 64.9
Miniature 63.2
Mythology 62.9
Modern art 60.9
Illustration 60.2
Visual arts 59.3
Style 51

Microsoft
created on 2020-04-30

painting 99.9
drawing 99.7
text 98.3
sketch 97.8
book 96.5
cartoon 93.4
child art 89.2
person 83.7
several 10

Face analysis

Amazon

AWS Rekognition

Age 33-49
Gender Male, 53.9%
Disgusted 45%
Sad 46.3%
Fear 45%
Confused 45%
Calm 53.5%
Happy 45%
Surprised 45%
Angry 45%

AWS Rekognition

Age 23-37
Gender Male, 51.6%
Disgusted 45%
Happy 45%
Sad 54.3%
Fear 45%
Angry 45%
Calm 45.7%
Surprised 45%
Confused 45%

AWS Rekognition

Age 53-71
Gender Male, 53.7%
Fear 45%
Sad 45.5%
Calm 54.4%
Happy 45%
Angry 45.1%
Surprised 45%
Confused 45%
Disgusted 45%

AWS Rekognition

Age 36-52
Gender Male, 53.2%
Disgusted 45%
Calm 45.1%
Surprised 46.6%
Confused 45.1%
Happy 45%
Sad 45.1%
Fear 51.1%
Angry 46.9%

AWS Rekognition

Age 11-21
Gender Male, 52.7%
Angry 49.7%
Disgusted 45.1%
Fear 45.5%
Sad 47.2%
Confused 45.1%
Happy 45.1%
Calm 47.2%
Surprised 45.2%

AWS Rekognition

Age 22-34
Gender Female, 54.3%
Disgusted 45%
Sad 45.1%
Angry 45.1%
Calm 45.3%
Confused 45%
Fear 45.2%
Surprised 45.1%
Happy 54.3%
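
Each face record above mirrors a Rekognition DetectFaces `FaceDetail` entry: an age range, a gender estimate with confidence, and per-emotion confidences. A sketch (with hypothetical values taken from the last record) of reducing one such entry to the summary shown:

```python
# Hypothetical Rekognition-style FaceDetail record; a live call would be
# boto3.client("rekognition").detect_faces(Image={...}, Attributes=["ALL"])
face = {
    "AgeRange": {"Low": 22, "High": 34},
    "Gender": {"Value": "Female", "Confidence": 54.3},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 54.3},
        {"Type": "CALM", "Confidence": 45.3},
        {"Type": "SAD", "Confidence": 45.1},
    ],
}

# The dominant emotion is simply the one with the highest confidence
dominant = max(face["Emotions"], key=lambda e: e["Confidence"])

summary = (
    f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}, "
    f"Gender {face['Gender']['Value']} ({face['Gender']['Confidence']}%), "
    f"{dominant['Type'].title()} {dominant['Confidence']}%"
)
print(summary)
```

Note that the near-uniform 45% floor across emotions in these records suggests the scores were rescaled before publication; the raw API reports each emotion's confidence independently.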

Feature analysis

Amazon

Painting 99.6%
Person 97%

Captions

Microsoft
created on 2020-04-30

a close up of a book 34.3%
close up of a book 29.8%
a close up of a book cover 29.7%