Human Generated Data

Title

Untitled (work in lumber yard, Nigeria)

Date

1959

People

Artist: Ken Heyman, American, born 1930

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, 2011.549

Copyright

© Ken Heyman

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Human 96.8
Person 94.2
Man 86.1
Face 58.8

Clarifai
created on 2018-02-10

people 99.8
one 99.6
adult 99.5
man 99.1
portrait 98.9
athlete 96.4
shirtless 91.1
wear 89
monochrome 88.4
sculpture 87.9
two 87.9
competition 84.3
statue 84.2
nude 81.8
son 81
street 80.7
art 80.5
boy 78.1
side view 77.7
city 74.9

Imagga
created on 2018-02-10

man 32.9
statue 32.3
clothing 29.7
body 28.8
male 27.7
person 27.1
muscular 26.7
adult 26.5
model 26.4
portrait 22.6
fitness 22.6
sexy 22.5
black 22.4
muscle 22.1
covering 22
attractive 21
mask 20.7
strong 19.7
fashion 19.6
torso 19.5
chest 19.5
cap 19.5
people 19
headdress 16.9
guy 16.7
fit 16.6
human 16.5
healthy 15.7
garment 15.6
face 15.6
athletic 15.3
handsome 15.2
one 14.9
macho 14.7
looking 14.4
abs 13.8
athlete 13.6
pose 13.6
dark 13.4
posing 13.3
lifestyle 12.3
hat 12.3
health 11.8
exercise 11.8
naked 11.6
skin 11.2
strength 11.2
bodybuilder 11.1
abdominal 11
sculpture 10.8
sensuality 10
art 9.8
abdomen 9.8
muscles 9.8
nude 9.7
arm 9.5
men 9.4
sports 9.2
power 9.2
sport 8.9
jeans 8.6
studio 8.4
slim 8.3
style 8.2
lady 8.1
metal 8
water 8
hair 7.9
hunk 7.9
pretty 7.7
hand 7.6
build 7.6
holding 7.4
figure 7.3
alone 7.3
sensual 7.3

Google
created on 2018-02-10

man 93.2
black and white 89.7
standing 84.5
male 79.8
sculpture 79.6
monochrome photography 75.5
arm 72.6
muscle 72
classical sculpture 67.5
monochrome 67.4
metal 63.7
statue 60
material 59.7
chest 57
trunk 55.7
neck 50.2

Microsoft
created on 2018-02-10

person 99.6
sculpture 57.2
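The four tag lists above each pair a label with what appears to be a confidence percentage. A minimal sketch of cross-service agreement (the dicts are hand-transcribed subsets of the scores listed above; the function name and thresholds are illustrative assumptions, not part of any service's API):

```python
# Hand-transcribed subsets of the machine-generated tags listed above.
tags = {
    "Amazon":    {"human": 96.8, "person": 94.2, "man": 86.1, "face": 58.8},
    "Clarifai":  {"people": 99.8, "man": 99.1, "portrait": 98.9, "sculpture": 87.9},
    "Imagga":    {"man": 32.9, "statue": 32.3, "person": 27.1, "portrait": 22.6},
    "Google":    {"man": 93.2, "sculpture": 79.6, "statue": 60.0},
    "Microsoft": {"person": 99.6, "sculpture": 57.2},
}

def consensus_labels(tags, threshold=50.0, min_services=2):
    """Labels reported above `threshold` by at least `min_services` services."""
    counts = {}
    for scores in tags.values():
        for label, score in scores.items():
            if score >= threshold:
                counts[label] = counts.get(label, 0) + 1
    return sorted(label for label, n in counts.items() if n >= min_services)

print(consensus_labels(tags))  # → ['man', 'person', 'sculpture']
```

With these subsets, only "man", "person", and "sculpture" clear the 50% bar in two or more services, which matches the photograph's subject.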

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 57.7%
Happy 1.2%
Disgusted 1.4%
Surprised 0.8%
Confused 1.7%
Calm 80.3%
Sad 11.6%
Angry 2.9%
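The Rekognition emotion scores above sum to roughly 100%, so the dominant emotion is simply the highest-scoring entry. A minimal sketch (the dict is transcribed from the list above):

```python
# Emotion confidences transcribed from the AWS Rekognition output above.
emotions = {
    "Happy": 1.2, "Disgusted": 1.4, "Surprised": 0.8, "Confused": 1.7,
    "Calm": 80.3, "Sad": 11.6, "Angry": 2.9,
}

# Pick the highest-scoring emotion.
dominant = max(emotions, key=emotions.get)
print(dominant)  # → Calm
```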

Feature analysis

Amazon

Person 94.2%

Captions

Microsoft

a person wearing a costume 69.2%
a close up of a person wearing a costume 62.2%
a person standing in front of a building 44.4%