Human Generated Data

Title

Between Rounds

Date

20th century

People

Artist: Hyman Bloom, American, 1913–2009

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Gift, 1953.199

Copyright

© Hyman Bloom Estate

Machine Generated Data

Tags

Amazon
created on 2019-05-31

Art 97.7
Human 92.3
Person 84.9
Painting 79.4
Drawing 77.2
Person 62.2
Sketch 59.2

Clarifai
created on 2019-05-31

people 99.9
adult 98.4
two 98.3
group 98.1
art 97.8
print 96.6
man 96.2
illustration 93.7
one 93.1
mammal 92.5
three 91.7
canine 91.1
woman 90.9
administration 90.5
cavalry 89.9
wear 89.2
child 87.7
engraving 86.3
many 83.0
music 82.5

Imagga
created on 2019-05-31

fountain 47.7
structure 34.9
statue 33.6
sculpture 29.6
sketch 24.3
stone 21.7
art 20.2
drawing 18.8
people 17.9
ancient 17.3
old 16.7
column 16.5
architecture 16.5
marble 15.5
outdoor 15.3
religion 15.2
monument 15
travel 14.1
representation 14
portrait 13.6
culture 12.8
historical 12.2
person 12
black 12
body 12
figure 11.8
tourism 11.6
face 11.4
antique 11.3
city 10.8
history 10.7
cemetery 10.6
god 10.5
sexy 10.4
religious 10.3
love 10.3
historic 10.1
man 9.7
building 9.6
decoration 9.4
groom 9.2
sepia 8.7
grunge 8.5
memorial 8.5
adult 8.4
head 8.4
summer 8.4
traditional 8.3
child 8.3
vintage 8.3
human 8.3
landmark 8.1
lady 8.1
detail 8
snow 8
carving 7.8
roman 7.8
outdoors 7.5
famous 7.4
style 7.4
dress 7.2
male 7.1
temple 7.1
sky 7

Google
created on 2019-05-31

Drawing 86.8
Art 82.6
Painting 77.1
Sketch 75.8
Figure drawing 71.2
Illustration 66.9
Artwork 65.2
Stock photography 59.4
Visual arts 55
Chest 53.9
Trunk 53.9

Microsoft
created on 2019-05-31

sketch 98.4
drawing 98.2
text 92
painting 90.2
old 61.1
cartoon 52.5

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 74.3%
Surprised 5.9%
Sad 24.1%
Happy 2.4%
Angry 22.7%
Disgusted 20%
Confused 14.7%
Calm 10.3%

Feature analysis

Amazon

Person 84.9%
Painting 79.4%

Captions

Microsoft
created on 2019-05-31

an old photo of a person 80%
an old photo of a person 78.6%
an old photo of a girl 61.7%

Text analysis

Amazon

H.Rloom
H
H H H.Rloom