Human Generated Data

Title

Untitled (profile of soldier, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American, 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.174.3

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Head 100
Face 98.4
Human 98.4
Person 92.2
Hair 68.1
Portrait 68
Photography 68
Photo 68
Finger 59.2
LCD Screen 59
Electronics 59
Monitor 59
Display 59
Screen 59
Lip 56.6
Mouth 56.6
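
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's label detection. A minimal sketch of that call with boto3, assuming credentials are configured in the environment and using a hypothetical local copy of the photograph (the file name, label cap, and confidence threshold are illustrative, not taken from this record):

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph; not part of the museum record.
with open("2007.184.2.174.3.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # illustrative cap
    MinConfidence=50.0,  # illustrative threshold
)

# Print "label confidence" pairs, similar to the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')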

Clarifai
created on 2023-10-22

portrait 99.5
people 99.1
man 99
monochrome 98.3
face 98.1
one 98
dark 96.3
profile 95
eye 94.8
adult 94.5
head 93.2
model 92.5
light 91.1
self 89.8
bald 89.6
studio 88.9
serious 88.9
street 88.9
sadness 88.6
side view 88.6
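
The Clarifai concepts follow the same name-plus-confidence pattern. A hedged sketch against Clarifai's v2 REST API, assuming an API key and a publicly hosted copy of the image; the model identifier and image URL below are placeholders:

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"            # assumption: issued from your Clarifai account
MODEL_ID = "general-image-recognition"       # illustrative general-model identifier
IMAGE_URL = "https://example.org/photo.jpg"  # hypothetical location of the image

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Each concept carries a name and a 0-1 confidence value.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')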

Imagga
created on 2021-12-14

beard 35
man 35
face 31.3
person 30.4
hat 29.7
male 27.7
portrait 26.5
model 24.1
people 22.3
black 21.9
adult 21.3
headdress 20.3
head 19.3
ruler 18.8
guy 17.8
human 17.3
eyes 17.2
clothing 15.7
sculpture 15.6
close 15.4
handsome 15.2
dark 15
cap 14.9
attractive 14.7
expression 14.5
looking 14.4
body 14.4
eye 14.3
hair 14.3
cowboy hat 13.9
men 13.7
sexy 13.7
one 13.4
nose 12.7
serious 12.4
skin 12
old 11.8
boy 11.4
smile 11.4
fashion 11.3
muscular 10.5
bathing cap 10.5
art 10.2
emotion 10.1
statue 9.7
look 9.6
bust 9.6
senior 9.4
strong 9.4
covering 9.3
lips 9.3
macho 9.2
health 9
healthy 8.8
masculine 8.8
lifestyle 8.7
mouth 8.6
hand 8.5
happy 8.1
blond 8.1
closeup 8.1
torso 7.8
muscle 7.7
consumer goods 7.6
gesture 7.6
studio 7.6
shirt 7.5
smoke 7.4
child 7.4
fit 7.4
posing 7.1
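
Imagga's tags can be requested from its v2 tagging endpoint, which authenticates with an API key/secret pair over HTTP Basic auth. A sketch assuming such credentials and a hosted image URL (all identifiers are placeholders):

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # assumption: from your Imagga account
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # assumption
IMAGE_URL = "https://example.org/photo.jpg"  # hypothetical image location

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Tags come back with an English label and a 0-100 confidence score.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')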

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

man 99.6
person 99.5
indoor 98.3
human face 98.1
text 78.6
face 71.6
portrait 52
staring 21.1

Color Analysis

Face analysis

AWS Rekognition

Age 26-42
Gender Female, 89.2%
Angry 67%
Happy 23.3%
Calm 4.9%
Sad 2.4%
Fear 1.3%
Surprised 0.4%
Confused 0.4%
Disgusted 0.3%
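
Estimates like the age range, gender, and emotion scores above are what Rekognition's DetectFaces operation returns when full attributes are requested. A minimal boto3 sketch, again assuming a hypothetical local copy of the image:

import boto3

rekognition = boto3.client("rekognition")

with open("2007.184.2.174.3.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')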

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
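
Google Vision reports these face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch with the google-cloud-vision client, assuming application credentials are configured and a hypothetical local file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("2007.184.2.174.3.jpg", "rb") as f:  # hypothetical local file
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Each attribute is a Likelihood enum value, e.g. VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)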

Feature analysis

Amazon

Person 92.2%

Captions

Microsoft
created on 2021-12-14

a man looking at the camera 76.7%
a screen shot of a man 76.6%
a man in a dark room 75%
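
The Microsoft tags and captions match the output shape of Azure Computer Vision's Analyze Image operation. A hedged REST sketch, assuming a Computer Vision endpoint and key; the v3.2 API path, endpoint, and image URL are assumptions, not details from this record:

import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # assumption
KEY = "YOUR_AZURE_KEY"                                          # assumption
IMAGE_URL = "https://example.org/photo.jpg"                     # hypothetical

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Description,Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()
analysis = resp.json()

# Tags resemble the Microsoft tag list above; captions carry a 0-1 confidence.
for tag in analysis["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
for caption in analysis["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')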