Human Generated Data

Title

The Sower I (brown)

Date

1897

People

Artist: Hans Thoma, German, 1839 - 1924

Classification

Prints

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of Wolfgang and Emmerich Bünemann, in honor of their father Hermann Bünemann, 2011.466

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Art 100
Painting 100
Face 99.3
Head 99.3
Photography 99.3
Portrait 99.3
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Person 96.9
Person 93.4
Archaeology 84.1
Animal 74.6
Horse 74.6
Mammal 74.6
Drawing 70.7

Clarifai
created on 2018-05-11

people 99.7
adult 98.8
art 98.6
one 97.2
illustration 97
man 96.9
print 96.7
portrait 94.3
painting 90.1
wear 88.6
engraving 87.4
woman 83.3
Renaissance 83
antique 82.4
veil 79.7
nude 79.5
lid 78.7
vintage 77.8
old 74.3
gown (clothing) 71.8

Imagga
created on 2023-10-06

statue 52.6
sculpture 37.8
art 31.7
sketch 29
religion 26.9
old 24.4
ancient 24.2
architecture 22.1
drawing 21.8
catholic 19.5
representation 19.1
history 18.8
religious 18.8
antique 18.7
detail 18.5
church 18.5
cemetery 18.4
god 18.2
stone 18.1
monument 17.8
famous 17.7
marble 17.5
culture 17.1
holy 15.4
saint 15.4
figure 15.1
vintage 14.9
tourism 14.9
landmark 14.5
column 14.3
symbol 14.2
historical 14.1
building 13.5
spirituality 13.4
travel 13.4
city 13.3
carving 12.3
historic 11
museum 10.9
closeup 10.8
faith 10.5
roman 10.3
fountain 10.1
ruler 9.8
decoration 9.7
spiritual 9.6
cathedral 9.6
cross 9.4
temple 8.9
angel 8.8
pray 8.7
artistic 8.7
face 8.5
sand 8.4
decorative 8.4
tourist 8.2
book jacket 8.2
statues 7.9
figurine 7.9
carved 7.8
golden 7.7
traditional 7.5
man 7.4
dress 7.2
design 7.2
newspaper 7.1
paper 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

text 86.6
old 53.7

Face analysis

AWS Rekognition

Age 16-24
Gender Female, 90.4%
Calm 64.4%
Surprised 25.9%
Angry 10.9%
Fear 6.1%
Confused 2.8%
Sad 2.3%
Happy 1%
Disgusted 0.4%

Microsoft Cognitive Services

Age 31
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Adult 99.2%
Male 99.2%
Man 99.2%
Horse 74.6%

Categories

Imagga

paintings art 96.8%
events parties 1.2%

Captions

Microsoft
created on 2018-05-11

an old photo of a person 62.4%
old photo of a person 56.8%
a old photo of a person 56.7%