Human Generated Data

Title

Untitled (two soldiers standing outside helicopter, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American, 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.22.3

Machine Generated Data

Tags

Amazon
created on 2019-05-28

Person 99.2
Human 99.2
Person 96.6
Face 83.8
Hardware 81.4
Computer 81.4
Computer Hardware 81.4
Computer Keyboard 81.4
Keyboard 81.4
Electronics 81.4
Clothing 78.1
Apparel 78.1
Head 71.7
Finger 67.2
Outdoors 62.4
People 61.6
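
These label names and confidence scores have the shape of an AWS Rekognition DetectLabels response. The following is a minimal sketch, in Python with boto3, of how such labels could be requested; the image filename, region, MaxLabels, and MinConfidence values are assumptions for illustration, since the actual request that produced these tags is not recorded here.

# Sketch of an AWS Rekognition DetectLabels call; the image path, region,
# and thresholds below are assumptions, not values tied to this record.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("2007.184.2.22.3.jpg", "rb") as f:   # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,            # assumed cap; this record lists 16 labels
    MinConfidence=60.0,      # assumed threshold; the lowest listed score is 61.6
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")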

Clarifai
created on 2019-05-28

people 99.4
vehicle 97.3
adult 97.2
man 97.2
two 92.5
car 91.3
one 90.2
portrait 87.9
transportation system 87.2
woman 86.8
group 86.7
monochrome 86.2
wear 85.5
administration 84.4
indoors 82.8
three 82.7
war 81.5
facial expression 81.4
aircraft 81.4
street 81.2

Imagga
created on 2019-05-28

statue 52.6
man 26.2
sculpture 25.5
old 20.2
astronaut 19.6
male 19.1
art 19.1
person 18.9
people 17.8
religion 15.2
architecture 14.8
ancient 14.7
portrait 14.2
antique 13.8
historic 13.8
culture 13.7
history 12.5
uniform 12.1
face 12.1
black 12
soldier 11.7
religious 11.2
monument 11.2
city 10.8
mask 10.3
sky 10.2
safety 10.1
protection 10
adult 9.7
building 9.7
military 9.7
men 9.4
famous 9.3
stone 9.3
travel 9.2
one 9
machinist 8.8
army 8.8
symbol 8.8
traditional 8.3
vintage 8.3
clothing 8.1
job 8
work 7.8
engineer 7.8
catholic 7.8
marble 7.7
war 7.7
industry 7.7
god 7.7
musical instrument 7.5
tourism 7.4
decoration 7.4
occupation 7.3
industrial 7.3
metal 7.2
dirty 7.2
weapon 7.2
newspaper 7.1
device 7.1

Google
created on 2019-05-28

Microsoft
created on 2019-05-28

person 99.1
human face 96.5
clothing 96.4
man 94.9
black and white 77.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Male, 82.7%
Calm 30.1%
Confused 6.8%
Happy 1%
Disgusted 5.3%
Surprised 1.5%
Angry 5.9%
Sad 49.5%
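
The age range, gender, and emotion scores above match the structure of an AWS Rekognition DetectFaces response with all facial attributes requested. Below is a minimal sketch of such a call in Python with boto3; the image filename and region are assumptions, and the sketch is illustrative rather than a record of how this analysis was actually run.

# Sketch of an AWS Rekognition DetectFaces call requesting full attributes;
# the image path and region are assumptions, not values tied to this record.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("2007.184.2.22.3.jpg", "rb") as f:   # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],      # request age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")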

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft
created on 2019-05-28

a man wearing a hat 59%
a man holding a gun 44%
a man wearing a costume 43.9%
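
Captions of this form are returned by the Microsoft Azure Computer Vision Describe Image operation. The sketch below shows one way to request them over the REST API in Python; the endpoint, subscription key, API version, and image filename are placeholder assumptions, and the service version current when this record was generated in 2019 may differ from the one shown.

# Sketch of an Azure Computer Vision "describe" request; endpoint, key, and
# image path are placeholders, not values tied to this record.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"   # placeholder
SUBSCRIPTION_KEY = "<your-key>"                                     # placeholder

with open("2007.184.2.22.3.jpg", "rb") as f:   # hypothetical local copy of the image
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},              # ask for several caption candidates
    headers={
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")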