Human Generated Data

Title

Untitled (two soldiers seated on ground talking, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American, 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.141.5

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Apparel 99.7
Clothing 99.7
Human 98.5
Person 98.5
Face 96.4
Smile 90.5
Hat 77.6
Female 76
Coat 75.2
Person 74.1
Plant 71.8
People 67.1
Woman 64.4
Advertisement 64
Poster 64
Photography 62
Photo 62
Costume 57.9
Glasses 57
Accessories 57
Accessory 57

Clarifai
created on 2023-10-22

people 99.6
portrait 98.4
monochrome 98.2
adult 97
one 95.8
military 95.7
woman 95.3
war 94
wear 93.5
man 92.7
girl 90.9
uniform 90.9
soldier 90.9
lid 89.9
outfit 88.8
two 88
vintage 87.2
retro 85.6
gun 84.7
child 83.2

Imagga
created on 2021-12-14

statue 33.6
person 26.3
military uniform 26.1
uniform 24.8
clothing 24.4
people 21.2
man 20.1
sculpture 18.6
hat 16.8
adult 16.2
old 16
religion 15.2
outdoors 14.9
male 14.9
covering 13.6
outdoor 13
consumer goods 12.4
one 11.9
scholar 11.6
history 11.6
sky 11.6
religious 11.2
groom 11
architecture 10.9
face 10.6
art 10.5
ancient 10.4
portrait 10.3
hair 10.3
stone 10.2
blond 10
child 9.9
travel 9.9
soldier 9.8
black 9.7
love 9.5
men 9.4
monument 9.3
intellectual 9.3
city 9.1
protection 9.1
park 9.1
private 9
world 9
catholic 8.7
outside 8.6
horse 8.5
dress 8.1
looking 8
god 7.6
hand 7.6
fashion 7.5
dark 7.5
tourism 7.4
symbol 7.4
girls 7.3
building 7.1
posing 7.1
summer 7.1
autumn 7
antique 7

Google
created on 2021-12-14 (no tags returned)

Microsoft
created on 2021-12-14

text 99.2
human face 96.5
outdoor 89.6
person 86.2
clothing 83.4
black and white 80.9
statue 73.4

Face analysis

AWS Rekognition

Age 25-39
Gender Male, 80.7%
Calm 48.2%
Angry 34.8%
Disgusted 3.9%
Confused 3.6%
Sad 3.2%
Surprised 2.6%
Happy 2.4%
Fear 1.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%
Hat 77.6%

Captions

Microsoft
created on 2021-12-14

an old photo of a person 65%
an old photo of a person 64.9%
an old photo of a girl 47.1%