Human Generated Data

Title

Café Scene

Date

1906

People

Artist: Denman Waldo Ross, American, 1853–1935

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Denman W. Ross, Class of 1875, 1936.154.107

Machine Generated Data

Tags

Amazon
created on 2020-04-26

Art 96.1
Furniture 94.4
Chair 94.4
Painting 92.7
Person 84.9
Human 84.9
Person 83.4
Person 83.3
Person 81.2
Person 60.6
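Each machine-generated tag above pairs a label with a confidence score. As a minimal sketch, assuming the Amazon tags came from AWS Rekognition's `detect_labels` API (the `Labels`/`Name`/`Confidence` response shape matches the real boto3 call; the helper name and sample data are illustrative, not a live API call), the raw response can be flattened into the `Label Confidence` lines shown here:

```python
# Sketch: flatten a Rekognition detect_labels-style response into
# "Label confidence" lines like the tag list above, dropping labels
# below a confidence threshold and sorting by descending confidence.

def summarize_labels(response, min_confidence=60.0):
    """Return 'Name Confidence' lines, highest confidence first."""
    labels = [
        (lbl["Name"], lbl["Confidence"])
        for lbl in response.get("Labels", [])
        if lbl["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    return ["%s %.1f" % (name, conf) for name, conf in labels]

# Example response, shaped like the output of
#   boto3.client("rekognition").detect_labels(Image={"Bytes": image_bytes})
sample = {
    "Labels": [
        {"Name": "Art", "Confidence": 96.1},
        {"Name": "Chair", "Confidence": 94.4},
        {"Name": "Person", "Confidence": 60.6},
        {"Name": "Dog", "Confidence": 12.0},  # dropped: below threshold
    ]
}

print("\n".join(summarize_labels(sample)))
```

The same flattening applies to the Clarifai, Imagga, Google, and Microsoft lists, which differ only in response schema.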

Clarifai
created on 2020-04-26

people 99.9
print 99.5
art 99.4
illustration 99
group 98.8
adult 98.5
man 97.7
painting 96
sepia pigment 92.9
wear 92
engraving 91.9
cavalry 91
vintage 89.9
veil 89.7
street 88.8
woman 88.6
old 87.8
paper 87.6
interaction 83.7
woodcut 83.3

Imagga
created on 2020-04-26

sketch 100
drawing 100
representation 98.4
art 30.8
architecture 28.2
sculpture 26.9
old 24.4
ancient 23.4
building 23.2
statue 22.1
history 21.5
famous 19.5
city 19.1
landmark 19
stone 18.6
tourism 16.5
monument 15.9
travel 15.5
antique 15.3
vintage 14.9
grunge 14.5
god 13.4
historical 13.2
culture 12.8
texture 11.8
religion 11.6
temple 11.5
artistic 11.3
detail 11.3
painting 10.8
symbol 10.8
wall 10.3
paper 10.2
church 10.2
tourist 10
carving 9.8
sky 9.6
man 9.4
column 9.3
exterior 9.2
decoration 9.2
aged 9
design 9
landscape 8.9
style 8.9
marble 8.9
urban 8.7
water 8.7
structure 8.6
palace 8.5
fountain 8.5
historic 8.3
ornate 8.2
figure 8.2
brown 8.1
black 7.8
baroque 7.8
facade 7.8
holy 7.7
pattern 7.5
traditional 7.5
currency 7.2

Google
created on 2020-04-26

Art 82.4
Painting 77.5
Drawing 76.4
Illustration 64
Visual arts 62.5
Sketch 56.6
Artwork 53.4

Microsoft
created on 2020-04-26

text 99.9
book 99.7
drawing 99.2
sketch 98.4
old 97.8
painting 92
person 71.3
cartoon 58.8
clothing 57.7
vintage 30.3

Face analysis

Amazon

AWS Rekognition

Age 34-50
Gender Male, 97.6%
Disgusted 0.1%
Surprised 3.6%
Fear 25.9%
Sad 12%
Angry 9.6%
Happy 1.2%
Calm 46.6%
Confused 1%

AWS Rekognition

Age 12-22
Gender Male, 52.9%
Sad 45%
Calm 45%
Disgusted 45%
Angry 54.3%
Confused 45%
Surprised 45.2%
Happy 45%
Fear 45.4%

AWS Rekognition

Age 21-33
Gender Female, 50.3%
Surprised 49.5%
Fear 49.5%
Angry 50.2%
Sad 49.5%
Disgusted 49.5%
Confused 49.5%
Calm 49.6%
Happy 49.6%
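The face-analysis blocks above report, per detected face, an age range, a gender guess with confidence, and a confidence per emotion. As a hedged sketch, assuming output shaped like AWS Rekognition's `detect_faces` (the `AgeRange`, `Gender`, and `Emotions` field names follow the real API; the `summarize_face` helper and sample values are illustrative), one face record can be rendered as the lines listed above:

```python
# Sketch: render one FaceDetail from a detect_faces-style response
# as "Age low-high", "Gender ..., conf%", and "Emotion conf%" lines,
# matching the face-analysis listing above.

def summarize_face(detail):
    """Return display lines for a single FaceDetail dict."""
    lines = ["Age %d-%d" % (detail["AgeRange"]["Low"],
                            detail["AgeRange"]["High"])]
    gender = detail["Gender"]
    lines.append("Gender %s, %.1f%%" % (gender["Value"], gender["Confidence"]))
    for emotion in detail["Emotions"]:
        # API emotion types are uppercase (e.g. "CALM"); display as "Calm".
        lines.append("%s %.1f%%" % (emotion["Type"].capitalize(),
                                    emotion["Confidence"]))
    return lines

# Illustrative FaceDetail mirroring the first face listed above.
sample_detail = {
    "AgeRange": {"Low": 34, "High": 50},
    "Gender": {"Value": "Male", "Confidence": 97.6},
    "Emotions": [
        {"Type": "CALM", "Confidence": 46.6},
        {"Type": "FEAR", "Confidence": 25.9},
    ],
}

print("\n".join(summarize_face(sample_detail)))
```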

Feature analysis

Amazon

Chair 94.4%
Painting 92.7%
Person 84.9%

Captions

Microsoft

a vintage photo of a group of people 83.9%
an old photo of a group of people 83.8%
a vintage photo of some people 83.1%

Text analysis

Amazon

1196
bia

Google

7796
7796