Human Generated Data

Title

Crackers

Date

1969

People

Artist: Edward Ruscha, American, born 1937

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, M24197

Copyright

© Ed Ruscha

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Furniture 98.3
Text 94.7
Bed 91.7
Human 89.6
Person 89.6
Cushion 81.8
Pillow 81.8
Face 78.5
Page 70.4
Female 66.8
Art 64.5
Drawing 64.5
Photography 63.2
Portrait 63.2
Photo 63.2
Bedroom 61
Room 61
Indoors 61
Girl 59.2
Newspaper 57.6

Clarifai
created on 2018-03-16

people 99.9
adult 99.5
one 99.4
woman 96.6
portrait 95.6
print 95.4
man 93.8
wear 93.1
music 92.2
furniture 91.8
two 91.6
book bindings 91
musician 86.8
art 86.6
recreation 85
vehicle 84.2
reclining 83.9
child 83.2
veil 83.1
book series 83.1

Imagga
created on 2018-03-16

person 28.6
adult 28.5
sofa 27.4
sketch 27.4
happy 25
newspaper 24.7
home 23.9
people 22.9
drawing 22.5
sitting 21.5
attractive 21
man 20.1
product 20.1
portrait 18.8
smile 18.5
smiling 18.1
couch 17.4
male 17.1
room 16.8
pretty 16.8
representation 16.8
looking 16
creation 15.7
one 15.7
model 15.5
book jacket 14.9
book 14.8
modern 14.7
computer 14.5
lifestyle 14.4
relaxation 14.2
women 14.2
indoors 14.1
jacket 13.6
relaxing 13.6
casual 13.5
relax 13.5
leisure 13.3
interior 13.3
cute 12.9
laptop 12.9
fashion 12.8
studio couch 12.7
resting 12.4
working 12.4
relaxed 12.2
technology 11.9
alone 11.9
covering 11.6
convertible 11.1
work 11
lady 10.5
reading 10.5
brunette 10.4
living 10.4
quilt 10.2
communication 10.1
indoor 10
house 10
rest 9.8
cheerful 9.7
sexy 9.6
couple 9.6
comfortable 9.5
hair 9.5
love 9.5
happiness 9.4
youth 9.4
face 9.2
human 9
wrapping 8.8
seat 8.6
lying 8.5
child 8.3
outdoors 8.2
bedclothes 8.1
bed 8.1
business 7.9
together 7.9
cold 7.7
wireless 7.6
joy 7.5
fun 7.5
holding 7.4
style 7.4
student 7.2
posing 7.1
chair 7.1

Google
created on 2018-03-16

Microsoft
created on 2018-03-16

indoor 88.6
bed 81.3

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 99.4%
Sad 5.6%
Calm 81.9%
Angry 2.9%
Surprised 1%
Confused 6.8%
Happy 0.8%
Disgusted 1%

Feature analysis

Amazon

Person 89.6%

Captions

Microsoft

a person lying on a bed 63.2%
a person lying in bed reading a book 37.3%
a person lying on a bed 37.2%

Text analysis

Amazon

Wniw

Google

FA
FA