Human Generated Data

Title

Man Seated at a Table No. 2

Date

c. 1937

People

Artist: Theodore Roszak, American, 1907 - 1981

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous Loan in honor of Harry Cooper, 15.2005

Copyright

© Sarah Jane Roszak / Artists Rights Society (ARS), New York, NY

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Human 97.2
Art 90.2
Person 84.9
Modern Art 78.4
Kneeling 69.9
Painting 64.3
Drawing 55.9

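The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's label detection. As a rough illustration only (not the museum's actual pipeline), a minimal boto3 sketch that would produce output in this name-plus-score format might look like the following; the region, local file path, and the MaxLabels/MinConfidence settings are assumptions.

# Minimal sketch: label detection with AWS Rekognition via boto3.
# Assumes AWS credentials are configured and "image.jpg" is a local copy
# of the work; these are illustrative choices, not the source pipeline.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("image.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap on labels returned (assumed value)
    MinConfidence=50.0,  # drop low-confidence labels (assumed value)
)

# Print "Label confidence" pairs, mirroring the tag list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
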
Clarifai
created on 2019-10-29

people 97.5
adult 97.2
one 97.1
painting 96.9
art 95.3
woman 94
wear 92.5
man 90.8
portrait 90.3
sculpture 87.7
nude 87.6
no person 87
veil 86.7
illustration 83.3
religion 82.2
face 79.6
ancient 74.8
print 73.8
god 69.9
indoors 68.1

Imagga
created on 2019-10-29

book jacket 100
jacket 100
wrapping 84.1
covering 60.2
black 33.6
dark 26.7
adult 20
portrait 19.4
man 18.8
person 18.8
male 17.7
art 16.8
people 16.2
one 15.7
model 14.8
sexy 14.4
studio 14.4
body 14.4
attractive 14
light 13.4
vintage 13.2
mask 12.7
face 12.1
human 12
evil 11.7
culture 11.1
symbol 10.8
night 10.7
fashion 10.5
lady 10.5
painted 10.5
make 10
postmark 9.9
old 9.8
stamp 9.7
looking 9.6
mail 9.6
post 9.5
eyes 9.5
letter 9.2
silhouette 9.1
pretty 9.1
painter 9.1
religion 9
figure 8.9
postage 8.8
office 8.8
postal 8.8
paintings 8.8
design 8.8
spooky 8.8
envelope 8.8
scary 8.7
holiday 8.6
fire 8.4
church 8.3
pumpkin 8
hair 7.9
icon 7.9
masterpiece 7.9
known 7.9
shows 7.9
printed 7.9
cutting 7.7
wall 7.7
grunge 7.7
seductive 7.6
fine 7.6
hot 7.5

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

text 99.5
art 99
drawing 98.7
book 98.5
painting 96.9
sketch 95.9
cartoon 93.5
orange 85.5
poster 85
person 79.1
illustration 76.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Male, 95.3%
Happy 0.1%
Calm 1.9%
Sad 93.9%
Angry 2%
Disgusted 0.1%
Surprised 0.1%
Fear 1.6%
Confused 0.4%

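The face analysis block above (age range, gender, and per-emotion scores) follows the shape of AWS Rekognition's face detection output. A minimal sketch, again assuming a local image file and standard boto3 credentials rather than the museum's actual workflow:

# Minimal sketch: face attributes with AWS Rekognition via boto3.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("image.jpg", "rb") as f:  # local copy of the print (assumed)
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotion estimates
)

# Print values in the same layout as the Face analysis section above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
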
Feature analysis

Amazon

Person 84.9%
Painting 64.3%

Categories

Imagga

interior objects 82.6%
paintings art 15.1%
food drinks 1.7%

Captions

Microsoft
created on 2019-10-29

an orange and black text 44%
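
The Microsoft caption above is the kind of result returned by Azure Computer Vision's "describe" operation. A minimal sketch under assumed settings follows; the endpoint, subscription key, API version, and file path are placeholders, not the museum's configuration.

# Minimal sketch: image captioning via the Azure Computer Vision
# "describe" REST endpoint.
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<subscription-key>"

with open("image.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    f"{endpoint}/vision/v3.2/describe",
    headers={
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

# Each caption carries a text string and a confidence in [0, 1].
for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.0f}%')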