Human Generated Data

Title

Untitled (medevac pilot SP5 Herbert C. Donaldson wiping eyes following rescue, Vietnam)

Date

1967

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.1.146

Machine Generated Data

Tags

Amazon
created on 2019-03-22

Helmet 97.9
Apparel 97.9
Clothing 97.9
Person 95.2
Human 95.2
Hat 89.8
Finger 83.4
Leisure Activities 75.2
Musical Instrument 63.4

Clarifai
created on 2019-03-22

man 99.3
portrait 99.2
monochrome 98.6
people 98.3
adult 97.1
guy 97.1
one 96.8
lid 95
face 93.6
fashion 93
model 92.3
black and white 92.2
dark 90.7
smoke 89
fine-looking 88
sepia 87.5
indoors 86.6
boy 86.4
studio 85.9
serious 84

Imagga
created on 2019-03-22

person 38.8
seat belt 36.5
face 32.7
portrait 31.7
adult 29.8
safety belt 29.2
hat 27.8
hair 25.4
attractive 24.5
black 24.3
smile 24.2
restraint 23
people 22.9
eyes 22.4
pretty 21.7
fashion 20.4
blond 18.1
model 17.9
happy 17.6
child 17.5
expression 17.1
clothing 16.4
eye 16.1
looking 16
smiling 15.9
lady 15.4
look 14.9
call 14.6
headdress 14.3
man 14.1
cowboy hat 14.1
girls 13.7
make 13.6
car 12.9
sexy 12.9
one 12.7
head 12.6
studio 12.2
male 11.9
women 11.9
bride 11.5
beard 11.5
cute 11.5
human 11.3
modern 11.2
youth 11.1
close 10.9
dark 10
dress 9.9
hand 9.9
business 9.7
device 9.6
brunette 9.6
elegance 9.2
makeup 9.2
long 9.2
sensuality 9.1
holding 9.1
style 8.9
boy 8.7
automobile 8.6
happiness 8.6
sitting 8.6
skin 8.6
serious 8.6
mouth 8.5
phone 8.3
aviator 8.3
wedding 8.3
businesswoman 8.2
cheerful 8.1
handsome 8
kid 8
driver 7.8
facial 7.7
hairstyle 7.6
talking 7.6
females 7.6
lips 7.4
emotion 7.4
teenager 7.3
lifestyle 7.2

Google
created on 2019-03-22

Microsoft
created on 2019-03-22

person 99.6
indoor 94.8
portrait 21.1
face 9.2
black and white 7.6

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 94.8%
Disgusted 1.2%
Happy 0.6%
Calm 33%
Sad 54.3%
Angry 4.8%
Confused 4.5%
Surprised 1.5%

Microsoft Cognitive Services

Age 55
Gender Male

Feature analysis

Amazon

Helmet 97.9%
Person 95.2%
Hat 89.8%

Categories

Imagga

pets animals 46.3%
paintings art 36.2%
people portraits 16.8%

Captions

Microsoft
created on 2019-03-22

a close up of a person wearing a hat 72.3%