Human Generated Data

Title

Breadline

Date

1932

People

Artist: Reginald Marsh, American, 1898–1954

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Meta and Paul J. Sachs, M13174

Copyright

© Estate of Reginald Marsh / Art Students League, New York / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Person 98.4
Human 98.4
Person 97.5
Art 97.5
Person 97.3
Person 96.2
Person 96
Person 95.3
Person 91.3
Person 89
Person 87.1
Person 87
Person 82.1
Painting 78.5
Person 71.8
Person 65.8

Clarifai
created on 2019-10-29

print 99.5
illustration 99.2
people 99.1
art 98.9
one 97.5
engraving 97.4
wear 97
group 96.5
adult 95.9
text 95
no person 94.3
many 93.9
war 92.7
vintage 91.5
old 91.4
military 91.3
man 90.2
desktop 89.6
two 88.5
ancient 87.9

Imagga
created on 2019-10-29

container 30.4
old 20.2
vintage 19.8
paper 18.8
grunge 18.7
bag 17.9
arabesque 17
fastener 16.6
currency 16.1
wallet 14.9
antique 14.2
decoration 13.8
design 13.5
finance 13.5
art 13.2
ancient 13
money 12.8
texture 12.5
drawing 12.4
wallpaper 12.2
black 12
close 12
restraint 11.8
pattern 11.6
retro 11.5
detail 11.3
business 10.9
wood 10.8
frame 10.8
sketch 10.7
gold 10.7
style 10.4
case 10.2
needle 10.2
symbol 10.1
border 9.9
backdrop 9.9
tray 9.9
backgrounds 9.7
dollar 9.3
cash 9.1
lace 9.1
dirty 9
wealth 9
hair slide 8.9
object 8.8
graphic 8.7
ornament 8.6
nobody 8.5
damask 8.5
floral 8.5
card 8.5
color 8.3
bank 8.1
buckle 8
receptacle 8
decor 8
empty 7.7
blank 7.7
wall 7.7
bill 7.6
sign 7.5
closeup 7.4
clip 7.4
note 7.3
historic 7.3
paint 7.2
aged 7.2
metal 7.2
financial 7.1

Google
created on 2019-10-29

Drawing 87.8
Art 77.7
Sketch 72.1
Visual arts 71.1
Artwork 68.7
Modern art 66.9
Painting 60.3
Printmaking 54.7
Illustration 54.5
Collection 52.1

Microsoft
created on 2019-10-29

sketch 94
text 93.3
drawing 93
person 73.5
museum 69.8
art 64.8

Face analysis

Amazon

AWS Rekognition

Age 26-40
Gender Male, 55%
Confused 45%
Angry 45%
Fear 45%
Sad 45%
Surprised 45%
Disgusted 45%
Happy 45%
Calm 55%

AWS Rekognition

Age 23-35
Gender Female, 50.2%
Confused 45%
Disgusted 45%
Calm 45%
Angry 53.9%
Happy 45%
Sad 45%
Surprised 45%
Fear 46%

AWS Rekognition

Age 18-30
Gender Male, 54.3%
Sad 45%
Angry 54.6%
Happy 45.1%
Confused 45%
Disgusted 45%
Calm 45%
Surprised 45%
Fear 45.3%

AWS Rekognition

Age 30-46
Gender Male, 54.1%
Angry 45.7%
Happy 45.1%
Confused 45%
Surprised 45%
Disgusted 45%
Sad 52.1%
Fear 45.2%
Calm 46.9%

AWS Rekognition

Age 17-29
Gender Female, 50.3%
Sad 45.2%
Happy 45.5%
Confused 45.4%
Surprised 46.2%
Calm 45.7%
Disgusted 51.1%
Fear 45.6%
Angry 45.4%

AWS Rekognition

Age 20-32
Gender Male, 50.5%
Happy 49.5%
Calm 49.6%
Sad 50.4%
Angry 49.5%
Surprised 49.5%
Fear 49.5%
Confused 49.5%
Disgusted 49.5%

AWS Rekognition

Age 23-37
Gender Male, 54.8%
Surprised 45%
Happy 45%
Angry 45%
Fear 45%
Disgusted 45%
Confused 45%
Sad 45%
Calm 54.9%

Feature analysis

Amazon

Person 98.4%
Painting 78.5%

Captions

Microsoft

a close up of a curtain 38%
close up of a curtain 30.8%
a close up of a white wall 30.7%

Text analysis

Google

Breadlut
/193
Breadlut /193