Human Generated Data

Title

Illustration for "My Business Partner Gym"

Date

1913

People

Artist: Peter Sheaf Hersey Newell, American, 1862–1924

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Mrs. John G. Pierce, 1964.116

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2020-05-02

Nature 97.8
Human 97.6
Person 97.6
Person 96.6
Outdoors 96.5
Painting 91.9
Art 91.9
Ice 88.3
Snow 84.3
Clothing 80.2
Coat 80.2
Apparel 80.2
Person 76.3
Elephant 66.9
Animal 66.9
Wildlife 66.9
Mammal 66.9
Overcoat 58.5
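
These labels follow Amazon Rekognition's label-detection format: a name plus a 0-100 confidence score. A minimal sketch of how such tags could be produced with boto3, assuming configured AWS credentials; the filename is hypothetical:

```python
import boto3

# Assumes AWS credentials are configured in the environment.
client = boto3.client("rekognition")

# "artwork.jpg" is a hypothetical local copy of the drawing.
with open("artwork.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap the number of labels returned
    MinConfidence=50.0,  # drop low-confidence labels
)

# Each label pairs a name with a confidence score,
# matching the "Nature 97.8" entries above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```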

Clarifai
created on 2020-05-02

people 99.9
adult 99.1
veil 99
wear 98.6
man 98.5
group 98.2
three 97.3
two 97
woman 95
lid 95
group together 94.8
child 94.7
monochrome 93.9
four 93.5
portrait 92.4
coat 91.7
street 89.1
outerwear 88.2
art 87.3
container 85.9
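
Clarifai's general model returns concept/confidence pairs like the list above. A sketch against the Clarifai v2 REST API; the API key, image URL, and model ID here are assumptions, not values from this record:

```python
import requests

# Placeholders: substitute a real API key and image URL.
API_KEY = "YOUR_CLARIFAI_API_KEY"
IMAGE_URL = "https://example.com/artwork.jpg"

# "general-image-recognition" is Clarifai's public general model ID.
response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts come back with a 0-1 value; scale to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```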

Imagga
created on 2020-05-02

old 27.2
sculpture 19.2
ancient 19
architecture 18
statue 17.2
stone 17
building 16.7
tourism 16.5
travel 16.2
history 16.1
wall 15.4
religion 15.2
robe 15.1
city 15
vintage 14.9
art 14.3
culture 13.7
monument 13.1
clothing 13.1
container 13
ruler 12.9
temple 12.3
antique 12.1
historic 11.9
black 11.8
people 11.7
garment 11.3
door 10.6
white 10.3
grunge 10.2
person 10
umbrella 9.9
scene 9.5
historical 9.4
covering 9.4
church 9.2
tradition 9.2
street 9.2
house 9.2
traditional 9.1
home 8.8
urban 8.7
holiday 8.6
world 8.5
tourist 8.3
metropolitan 8.2
aged 8.1
dirty 8.1
man 8.1
portrait 7.8
bag 7.7
texture 7.6
human 7.5
ashcan 7.4
vacation 7.4
bucket 7.3
seller 7.2
canopy 7.2
landmark 7.2
adult 7.2
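
Imagga's tagging endpoint yields the same tag/confidence shape. A sketch against its v2 REST API, with placeholder credentials and image URL; the response layout shown is an assumption based on Imagga's documented format:

```python
import requests

# Placeholders for Imagga API credentials (HTTP basic auth).
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.com/artwork.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry has a confidence and a language-keyed tag name.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```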

Google
created on 2020-05-02

(no tags recorded for this service)

Microsoft
created on 2020-05-02

text 98.7
man 97.7
drawing 97.6
sketch 96.3
painting 87
person 86.8
clothing 80.3
black and white 74
waste container 73.1
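
The Microsoft tags correspond to Azure Computer Vision's tagging operation. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK; endpoint, key, and URL are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholders: substitute a real endpoint and subscription key.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com/"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.com/artwork.jpg"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# tag_image returns name/confidence pairs (confidence is 0-1).
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```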

Color Analysis

(color swatch data not captured in this extraction)

Face analysis

AWS Rekognition

Age 38-56
Gender Male, 54.7%
Calm 45.6%
Surprised 52.2%
Fear 45.2%
Disgusted 45.1%
Angry 46.2%
Sad 45%
Happy 45.6%
Confused 45.1%

AWS Rekognition

Age 37-55
Gender Male, 52.1%
Confused 45.1%
Disgusted 45.1%
Happy 45.2%
Sad 45.4%
Calm 45.3%
Angry 47.1%
Fear 51.6%
Surprised 45.3%
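
Both AWS Rekognition face records above (age range, gender, per-emotion confidences) mirror the FaceDetails structure returned by detect_faces. A minimal sketch, again assuming configured AWS credentials and a hypothetical local file:

```python
import boto3

client = boto3.client("rekognition")

with open("artwork.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

# Attributes=["ALL"] requests age, gender, and emotion estimates.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```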

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
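
Google Vision reports face attributes as coarse likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why this block reads differently from the Rekognition ones. A minimal sketch with the google-cloud-vision client, assuming application credentials are configured:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("artwork.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enum buckets, e.g. "Headwear Possible" above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```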

Feature analysis

Amazon

Person 97.6%
Painting 91.9%
Elephant 66.9%

Categories

Imagga

paintings art 95.6%
interior objects 1.5%
food drinks 1.1%

Captions

Microsoft
created on 2020-05-02

a man standing next to a book 61%
a man holding a book 48.3%
an old photo of a man 48.2%
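
Ranked captions like these come from Azure Computer Vision's describe operation, which returns several candidate sentences with confidences. A minimal sketch; endpoint, key, and URL are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                           # placeholder
IMAGE_URL = "https://example.com/artwork.jpg"                    # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# max_candidates asks for several ranked captions, as listed above.
result = client.describe_image(IMAGE_URL, max_candidates=3)
for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```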

Text analysis

Amazon

DANE
DAnsi
PterN'sLn
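
These strings are raw OCR from Amazon Rekognition text detection; the drawing's hand lettering accounts for the noisy readings. A minimal sketch, with a hypothetical local file:

```python
import boto3

client = boto3.client("rekognition")

with open("artwork.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE entries are whole detected lines; WORD entries are tokens.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')
```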

Google

cANS DAN Rien Nawelle
cANS
DAN
Rien
Nawelle
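
Google's readings come from the Vision API's text detection, which returns the full transcript first and then individual tokens, matching the layout above. A minimal sketch:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("artwork.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full transcript; the rest are
# individual words, as in the token list above.
for annotation in response.text_annotations:
    print(annotation.description)
```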