Human Generated Data

Title

Point of Order

Date

20th century

People

Artist: William Gropper, American, 1897 - 1977

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Louise E. Bettens Fund, M10032

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Human 99.3
Person 99.3
Person 97.1
Person 94.2
Art 91.3
Person 86.1
Furniture 85.3
Painting 78.1
Drawing 75.1
Person 73.5
Text 71.1
Table 65.7
Photo 63.5
Photography 63.5
Desk 62.7
Face 62.7
Portrait 62.7
Sitting 59.8
Tabletop 58.1
Sketch 57.3
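
The labels and confidence scores above are typical output of Amazon Rekognition's DetectLabels operation. A minimal sketch of such a call with boto3 follows; the S3 bucket and object key are placeholders, not the museum's actual storage.

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "point-of-order.jpg"}},
    MinConfidence=55,  # only return labels at or above this confidence
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")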

Clarifai
created on 2019-10-29

people 99.9
adult 99.6
print 99.1
group 98.7
one 98.4
man 97.4
wear 96.7
furniture 95.4
two 95.3
lithograph 94.4
woman 94.3
vehicle 94.1
art 91.7
illustration 89.8
seat 89.5
facial expression 88.8
administration 87.9
war 87.4
leader 86.4
portrait 85.7
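
Concept tags like these can be requested from Clarifai's v2 prediction API. The sketch below uses the plain REST endpoint via requests; the API key and model ID are placeholders (check Clarifai's documentation for the current general-model identifier), and the image URL is an assumption.

import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"       # placeholder key
GENERAL_MODEL_ID = "general-model-id"   # placeholder model ID

response = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/point-of-order.jpg"}}}]},
)

# each concept has a name and a 0-1 value; the page shows value * 100
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))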

Imagga
created on 2019-10-29

newspaper 100
product 90.4
creation 70.1
person 21.3
scholar 21.2
man 20.8
laptop 17.4
people 17.3
sitting 17.2
intellectual 16.9
sea 14.8
business 14.6
water 14
relaxation 13.4
book 13
male 12.8
travel 12.7
beach 12.6
relax 12.6
ocean 11.8
adult 11.7
happy 11.3
computer 11.2
attractive 10.5
money 10.2
lifestyle 10.1
relaxing 10
smile 10
vacation 9.8
education 9.5
smiling 9.4
leisure 9.1
alone 9.1
old 9.1
sky 8.9
technology 8.9
home 8.8
work 8.6
happiness 8.6
two 8.5
outdoor 8.4
lake 8.3
one 8.2
landscape 8.2
bank 8.1
working 7.9
hair 7.9
indoors 7.9
together 7.9
couple 7.8
black 7.8
couch 7.7
lonely 7.7
professional 7.7
resting 7.6
down 7.6
outdoors 7.5
office 7.3
calm 7.3
bench 7.3
looking 7.2
sunset 7.2
currency 7.2
daily 7.1
women 7.1
summer 7.1
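
Imagga exposes its tagger as a REST endpoint authenticated with an API key/secret pair over HTTP Basic auth. A hedged sketch, with placeholder credentials and image URL:

import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/point-of-order.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # placeholder credentials
)

# each entry carries a confidence score and a localized tag name
for item in response.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))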

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

text 99.8
drawing 99.7
sketch 99.4
book 99.2
painting 92.7
person 87.3
cartoon 83.6
illustration 83.5
man 74.3
clothing 73.1
child art 70
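
Tags in this form can be obtained from Azure Computer Vision's Analyze Image REST call with the Tags visual feature. The endpoint host, API version, key, and image URL below are placeholders.

import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder resource
KEY = "YOUR_KEY"                                                # placeholder key

response = requests.post(
    f"{ENDPOINT}/vision/v3.1/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/point-of-order.jpg"},
)

# confidence is reported 0-1; the page shows it as a percentage
for tag in response.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))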

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 21-33
Gender Male, 91.3%
Confused 0.3%
Fear 0.8%
Surprised 0.2%
Angry 93%
Happy 0.4%
Sad 3%
Disgusted 0.4%
Calm 2%

AWS Rekognition

Age 24-38
Gender Male, 54.9%
Disgusted 45%
Fear 45%
Sad 45.3%
Happy 45%
Angry 45%
Calm 54.3%
Confused 45%
Surprised 45.3%

AWS Rekognition

Age 19-31
Gender Female, 50.7%
Happy 45.4%
Calm 52.4%
Sad 45.6%
Angry 45.6%
Surprised 45.6%
Fear 45.4%
Confused 45.1%
Disgusted 45%

AWS Rekognition

Age 49-67
Gender Male, 96%
Disgusted 0.3%
Sad 26.7%
Fear 3.9%
Surprised 0.5%
Happy 0.4%
Calm 31.3%
Confused 1.1%
Angry 35.8%

Feature analysis

Amazon

Person 99.3%
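
Object-level scores such as the Person detection above can be read from the same DetectLabels response: each label may carry an Instances list with bounding boxes. A short sketch (same placeholder S3 location as before):

import boto3

rekognition = boto3.client("rekognition")
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "point-of-order.jpg"}}
)

for label in response["Labels"]:
    for instance in label.get("Instances", []):  # one entry per detected object
        print(label["Name"], round(instance["Confidence"], 1), instance["BoundingBox"])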

Categories

Captions

Microsoft
created on 2019-10-29

a book on a bed 26.2%
a close up of a person holding a book 26.1%
a person holding a book 26%
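
Candidate captions like these are produced by Azure Computer Vision's Description feature. A hedged sketch using the same Analyze Image REST call as above, requesting Description instead of Tags (endpoint, key, and image URL remain placeholders):

import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder resource
KEY = "YOUR_KEY"                                                # placeholder key

response = requests.post(
    f"{ENDPOINT}/vision/v3.1/analyze",
    params={"visualFeatures": "Description"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/point-of-order.jpg"},
)

for caption in response.json()["description"]["captions"]:
    print(caption["text"], round(caption["confidence"] * 100, 1))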

Text analysis

Amazon

GBaul-

Google

P et
P
et
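
The detected strings above are raw OCR output from two services: Amazon Rekognition's DetectText and Google Cloud Vision's text detection. A combined sketch follows; the S3 location, local file name, and credential setup are placeholders (the Google client reads credentials from the environment).

import boto3
from google.cloud import vision

# Amazon Rekognition text detection
rekognition = boto3.client("rekognition")
aws_response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "point-of-order.jpg"}}
)
for detection in aws_response["TextDetections"]:
    print(detection["DetectedText"], round(detection["Confidence"], 1))

# Google Cloud Vision text detection on a local copy of the image
client = vision.ImageAnnotatorClient()
with open("point-of-order.jpg", "rb") as f:
    image = vision.Image(content=f.read())
google_response = client.text_detection(image=image)
for annotation in google_response.text_annotations:
    print(annotation.description)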