Human Generated Data

Title

Untitled (soldier reading, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American, 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.70.1

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2019-05-28

Human 98.2
Person 98.2
Clothing 96.1
Apparel 96.1
Finger 71.9
Sleeve 71.8
Face 63.3
Photo 60.9
Photography 60.9
Furniture 58.6
Leisure Activities 57.4
Pants 56.3
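
These labels have the shape of output from Amazon Rekognition's DetectLabels operation, which returns label names with confidence scores on a 0-100 scale. A minimal sketch of how tags like these could be generated, assuming boto3 is configured with AWS credentials and the photograph is available as a local file (the file name below is a placeholder):

    import boto3

    rekognition = boto3.client("rekognition")

    # Placeholder file name for the photograph being tagged.
    with open("soldier_reading_vietnam.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # roughly the lowest score in the list above
        )

    # Each label carries a name and a 0-100 confidence, as in the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')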

Clarifai
created on 2019-05-28

people 99.8
monochrome 97.9
man 96.8
adult 96.2
one 95.8
woman 93.4
wear 93.3
two 92.3
street 91
music 80.4
wedding 77.6
elderly 76.6
administration 76.4
child 75
indoors 71.8
vehicle 70.8
war 69.1
military 69
recreation 67.9
analogue 65.6
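
These concepts resemble the output of Clarifai's general image recognition model, whose scores (0-1) correspond to the percentages listed above. A rough sketch against Clarifai's v2 REST API; the model name, API key, and image URL are placeholders, and the endpoint details are recalled from Clarifai's documentation rather than taken from this page:

    import requests

    # Placeholder credentials and image location.
    CLARIFAI_KEY = "<clarifai-api-key>"
    IMAGE_URL = "https://example.org/soldier_reading_vietnam.jpg"

    response = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": f"Key {CLARIFAI_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    output = response.json()["outputs"][0]

    # Concept values are 0-1; multiply by 100 to match the percentages above.
    for concept in output["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')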

Imagga
created on 2019-05-28

black 29.1
person 26.9
man 25.5
adult 24.1
people 24
male 22.1
portrait 20.1
dress 19
sexy 18.5
fashion 17.3
musical instrument 16.1
attractive 15.4
hair 14.3
face 13.5
human 13.5
model 13.2
suit 13.2
lady 12.2
stringed instrument 12.1
hand 11.4
clothing 11
garment 10.7
pretty 10.5
art 10.4
style 10.4
music 10
costume 9.9
love 9.5
musician 9.2
dark 9.2
studio 9.1
instrument 9.1
vintage 9.1
sensuality 9.1
covering 9.1
worker 9
metal 8.9
working 8.8
professional 8.8
serious 8.6
elegant 8.6
expression 8.5
cloak 8.5
hot 8.4
guy 8.3
holding 8.3
religion 8.1
handsome 8
light 8
posing 8
body 8
job 8
device 7.9
brunette 7.8
play 7.8
men 7.7
prayer 7.7
bowed stringed instrument 7.7
old 7.7
world 7.6
passion 7.5
entertainment 7.4
guitar 7.3
work 7.1
lovely 7.1
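
Imagga exposes a comparable auto-tagging endpoint that returns tag names with 0-100 confidence values. A sketch using its v2 REST API; the API key, secret, and image URL are placeholders, and the endpoint and response shape are recalled from Imagga's documentation rather than confirmed by this page:

    import requests

    # Placeholder credentials and image location.
    IMAGGA_KEY = "<imagga-api-key>"
    IMAGGA_SECRET = "<imagga-api-secret>"
    IMAGE_URL = "https://example.org/soldier_reading_vietnam.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )

    # Each entry pairs an English tag name with a 0-100 confidence score.
    for entry in response.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')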

Google
created on 2019-05-28

Microsoft
created on 2019-05-28

indoor 94.4
black and white 93.2
person 91.5
monochrome 79
concert 78.5
clothing 75.2
dark 35.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 11-18
Gender Female, 73.8%
Happy 5.4%
Confused 1.8%
Disgusted 3%
Surprised 1.9%
Sad 66.3%
Angry 6.3%
Calm 15.4%
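
The age range, gender estimate, and emotion scores above follow the structure returned by Rekognition's DetectFaces operation when all facial attributes are requested. A sketch under the same assumptions as the DetectLabels example (boto3 configured, placeholder file name):

    import boto3

    rekognition = boto3.client("rekognition")

    with open("soldier_reading_vietnam.jpg", "rb") as f:  # placeholder file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')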

Feature analysis

Amazon

Person 98.2%

Captions

Microsoft
created on 2019-05-28

a man in a dark room 73.8%
a man standing in a dark room 70.9%
a man sitting in a dark room 58.9%
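
Captions such as these, each paired with a confidence score, are the kind of output produced by Azure's Computer Vision image analysis API with the Description feature enabled. A sketch against the plain REST endpoint; the resource endpoint, key, and image URL are placeholders for an assumed Azure Computer Vision resource:

    import requests

    # Placeholder Azure Computer Vision resource details and image location.
    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
    KEY = "<subscription-key>"
    IMAGE_URL = "https://example.org/soldier_reading_vietnam.jpg"

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Description,Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    analysis = response.json()

    # Captions come back with text and a 0-1 confidence; scale to match above.
    for caption in analysis["description"]["captions"]:
        print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')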