Human Generated Data

Title

To the ovens

Date

1987

People

Artist: Murray Zimiles, American, born 1941

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, 2018.33.62.8

Copyright

© Murray Zimiles

Machine Generated Data

Tags

Values are each service's confidence scores on a 0–100 scale.

Amazon
created on 2020-02-04

Human 99.8
Person 99.8
Person 99.5
Drawing 96.1
Art 96.1
Person 95.3
Wood 94.7
Sketch 93.1
Person 85.9
People 69.1
Portrait 67.2
Photography 67.2
Photo 67.2
Face 67.2
Person 62.3
Painting 50.8
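
The label/confidence pairs above match the shape of output from AWS Rekognition's DetectLabels API. The sketch below is a minimal, hypothetical example of regenerating such tags with boto3; the S3 bucket and object key are placeholders, not values from this record.

    # Minimal sketch, assuming boto3 is installed and AWS credentials are
    # configured; the S3 bucket and key below are hypothetical placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "example-bucket",
                            "Name": "zimiles-to-the-ovens.jpg"}},
        MaxLabels=20,        # cap on the number of labels returned
        MinConfidence=50.0,  # drop labels scored below 50%
    )

    # Print "Label confidence" pairs, mirroring the tag list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')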

Clarifai
created on 2020-02-04

people 99.9
print 99.4
adult 99.2
group 98.7
man 98.1
art 97.9
illustration 97.5
two 95.9
wear 94.4
watercraft 93.9
engraving 93.4
vehicle 92.7
etching 91.1
woman 90.7
three 89.2
woodcut 88.9
administration 86.8
war 86.5
four 86.3
veil 85.3

Imagga
created on 2020-02-04

stick 30.9
crutch 28
water 26
beach 25.7
sketch 24.8
ocean 22.5
sea 21.9
staff 21.8
man 20.8
drawing 20.7
sunset 19.8
people 18.4
travel 18.3
silhouette 16.5
sand 16.3
male 15.6
sky 15.3
swab 15.3
outdoors 14.9
shore 14.9
representation 14.8
sun 14.5
black 14.4
coast 14.4
hockey stick 13.6
river 13.3
cleaning implement 13.1
reflection 13
leisure 12.4
vacation 12.3
lake 11.9
fisherman 11.9
person 11.7
summer 11.6
boat 11.3
men 11.2
landscape 11.1
old 11.1
newspaper 10.9
wave 10.4
adult 10.3
waves 10.2
sport 10.1
cleaner 9.5
cold 9.5
sunrise 9.4
clouds 9.3
recreation 9
daily 9
light 8.7
sports equipment 8.3
alone 8.2
horizon 8.1
sunlight 8
product 8
dawn 7.7
dusk 7.6
walking 7.6
bay 7.5
coastline 7.5
active 7.2
activity 7.2
romantic 7.1
mountain 7.1
ship 7

Google
created on 2020-02-04

Art 77.7
Illustration 75.4
Drawing 74.2
Visual arts 73.1
Painting 68.7
Stock photography 66.6
Sketch 60.3
Artwork 51.1

Microsoft
created on 2020-02-04

text 100
book 99.7
sketch 99.5
drawing 99.5
outdoor 99.5
person 87.2
cartoon 85.9
illustration 84.8
art 77.7
painting 67.7
man 62.7

Face analysis

Amazon

AWS Rekognition

Age 23-35
Gender Male, 82.5%
Happy 0.2%
Calm 29.2%
Sad 5.4%
Confused 1.1%
Disgusted 0.7%
Fear 10.6%
Angry 17.3%
Surprised 35.4%

AWS Rekognition

Age 22-34
Gender Female, 52.4%
Fear 2.3%
Calm 24.6%
Angry 32.1%
Sad 36.3%
Disgusted 0.3%
Happy 1.5%
Surprised 0.9%
Confused 2.1%

AWS Rekognition

Age 45-63
Gender Male, 78.4%
Disgusted 0.1%
Happy 0.2%
Fear 10.8%
Confused 0.1%
Surprised 2.3%
Calm 0.5%
Sad 0.3%
Angry 85.7%
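
The three age/gender/emotion blocks above have the shape of AWS Rekognition DetectFaces output. A minimal sketch with boto3 follows; the local image filename is a hypothetical placeholder.

    # Minimal sketch, assuming boto3 is configured; the image file name
    # is a hypothetical placeholder.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("zimiles-to-the-ovens.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    # Print one block per detected face, mirroring the layout above.
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')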

Feature analysis

Amazon

Person 99.8%
Painting 50.8%

Categories

Imagga

paintings art 71.3%
nature landscape 27.7%

Captions

Microsoft
created on 2020-02-04

a man holding a book 38.8%
a group of people on a beach 38.7%
an old photo of a man 38.6%
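
The caption/confidence lines above resemble the Describe Image output of Microsoft's Computer Vision service. A rough sketch with the Azure Python SDK is shown here; the endpoint, subscription key, and image URL are placeholders, not values from this record.

    # Rough sketch, assuming azure-cognitiveservices-vision-computervision is
    # installed; endpoint, key, and image URL are hypothetical placeholders.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://example.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),
    )

    description = client.describe_image(
        "https://example.org/images/zimiles-to-the-ovens.jpg",
        max_candidates=3,  # ask for several candidate captions
    )

    # Print each caption with its confidence as a percentage, as listed above.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}")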