Human Generated Data

Title

The Sower I (black)

Date

1897

People

Artist: Hans Thoma, German, 1839-1924

Classification

Prints

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of Wolfgang and Emmerich Bünemann, in honor of their father Hermann Bünemann, 2011.467

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Art 100
Painting 100
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Person 97.3
Face 96.7
Head 96.7
Photography 96.7
Portrait 96.7
Person 96.6
Drawing 90.3
Animal 81.8
Horse 81.8
Mammal 81.8
Horse 64.5
Text 56.8
Archaeology 55.9

Clarifai
created on 2018-05-11

people 99.9
adult 98.8
print 98.5
illustration 98
man 97.1
one 96.6
art 95.7
engraving 92.1
wear 88.3
group 86.8
portrait 86.6
leader 85.5
woman 84.7
war 82.8
military 82.3
administration 82.1
veil 82
weapon 80.9
lid 80.1
soldier 78.7

Imagga
created on 2023-10-06

sketch 100
drawing 100
representation 96
art 29
statue 27.5
sculpture 21.6
old 20.9
detail 18.5
religion 17
ancient 16.4
religious 15
vintage 14.9
symbol 14.8
catholic 14.6
figure 14.6
holy 14.5
man 13.4
antique 13
stone 12.8
black 12.6
architecture 12.5
god 12.4
church 12
paper 11.8
history 11.6
body 11.2
artwork 11
marble 10.9
saint 10.6
artistic 10.4
culture 10.3
design 10.2
closeup 10.1
style 9.6
faith 9.6
grunge 9.4
famous 9.3
dress 9
retro 9
spiritual 8.6
elegant 8.6
travel 8.5
decoration 8.4
head 8.4
people 8.4
tourism 8.3
person 7.9
model 7.8
line 7.7
money 7.7
elegance 7.6
historical 7.5
sign 7.5
decorative 7.5
human 7.5
monument 7.5
backdrop 7.4
shape 7.4
portrait 7.1
face 7.1
male 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

text 99.6
book 96

Face analysis

AWS Rekognition

Age 20-28
Gender Female, 98.7%
Calm 86.9%
Confused 8.1%
Surprised 8%
Fear 6%
Sad 2.4%
Angry 0.4%
Happy 0.3%
Disgusted 0.2%

Microsoft Cognitive Services

Age 30
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Horse 81.8%

Categories

Imagga

paintings art 99.2%

Captions

Microsoft
created on 2018-05-11

a person holding a book 34%
an old photo of a person 33.9%

Text analysis

Amazon

Saewam