Human Generated Data

Title

Untitled (two soldiers stopped on trail to talk, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American, 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.138.4

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99.3
Person 99.3
Person 89.7
Soil 79.6
Face 78
Military Uniform 77
Military 77
People 74
Soldier 71.6
Apparel 66.7
Clothing 66.7
Army 65
Armored 65
Archaeology 59.3
Female 58.6
Outdoors 56.6
Coat 55.1
Overcoat 55.1
Suit 55.1
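
Each machine-generated tag above pairs a label with a confidence score from 0 to 100. As a minimal sketch (hypothetical helper name, using a few of the Amazon Rekognition values listed above), tags like these are typically filtered by a confidence threshold before display:

```python
# A few of the Amazon Rekognition labels listed above, with their
# confidence scores (0-100). Values copied from the record itself.
tags = {
    "Person": 99.3,
    "Soil": 79.6,
    "Military Uniform": 77.0,
    "Soldier": 71.6,
    "Female": 58.6,
    "Suit": 55.1,
}

def confident_tags(tags, threshold=70.0):
    """Return labels at or above the threshold, highest confidence first."""
    ranked = sorted(tags.items(), key=lambda kv: -kv[1])
    return [label for label, score in ranked if score >= threshold]

print(confident_tags(tags))
# -> ['Person', 'Soil', 'Military Uniform', 'Soldier']
```

Lower-confidence labels such as "Female" (58.6) and "Suit" (55.1) would only appear if the threshold were relaxed, e.g. `confident_tags(tags, threshold=55.0)`.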

Imagga
created on 2021-12-14

gravestone 49.1
memorial 40.4
stone 40.3
cemetery 32.6
structure 29.9
old 26.5
tree 20.6
architecture 18.7
newspaper 17.4
travel 16.9
ancient 16.4
product 16.3
building 14.5
grunge 14.5
vintage 14.1
sky 13.4
landscape 13.4
trees 13.3
autumn 13.2
forest 13.1
man 12.8
creation 12.5
dark 11.7
history 11.6
tourism 11.5
light 11.4
wall 11
outdoor 10.7
art 10.5
antique 10.4
season 10.1
historic 10.1
canvas tent 9.8
texture 9.7
culture 9.4
monument 9.3
peaceful 9.2
city 9.1
aged 9
religion 9
night 8.9
sun 8.9
rock 8.7
historical 8.5
window 8.2
park 8.2
fall 8.1
natural 8
scenic 7.9
scene 7.8
scary 7.7
fog 7.7
mystery 7.7
winter 7.7
religious 7.5
road 7.2
landmark 7.2
fantasy 7.2
scenery 7.2
black 7.2
rural 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

tree 98.8
outdoor 97
black and white 90.5
text 90.5
person 80.7
clothing 79.4
monochrome 61.9
statue 61.2

Face analysis

AWS Rekognition

Age 35-51
Gender Female, 61.1%
Calm 94.2%
Sad 2.9%
Happy 1.2%
Angry 0.7%
Confused 0.4%
Surprised 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 13-25
Gender Female, 53.3%
Sad 73.7%
Calm 17.8%
Happy 6.5%
Fear 1.1%
Angry 0.4%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people standing next to a tree 59.9%
a group of people that are standing in the dirt 59.8%
a group of people standing in front of a tree 54.8%