Human Generated Data

Title

Untitled (soldiers trading supplies and tags, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.164.2

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2021-12-14

Clothing 99.4
Helmet 99.4
Apparel 99.4
Person 98.2
Human 98.2
Person 93.4
Astronaut 80.7
Helmet 71.7
Hat 65.5
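
The label list above is attributed to Amazon, so a minimal sketch of how such tags could be produced with the Amazon Rekognition DetectLabels API via boto3 follows; the bucket name, object key, and threshold values are hypothetical placeholders, not taken from the record.

import boto3

# Minimal sketch: request object labels for an image stored in S3.
# Bucket/key names below are placeholders.
rekognition = boto3.client("rekognition")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    # Each label carries a name and a confidence score (0-100),
    # matching the "Clothing 99.4" style of the list above.
    print(label["Name"], round(label["Confidence"], 1))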

Clarifai
created on 2023-10-22

people 99.7
monochrome 97.9
man 96.4
adult 96.3
two 95.6
wear 91.6
group together 91.1
three 88.4
group 87.1
war 86.6
outfit 86.5
lid 85.5
street 85.3
military 84.3
woman 83.2
uniform 82.7
portrait 82.5
veil 80.5
one 78.6
collage 76.4
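
The concepts above are attributed to Clarifai. A rough sketch against Clarifai's v2 predict REST endpoint is shown below; the model identifier, API key, image URL, and exact payload shape are assumptions and depend on the Clarifai API version in use.

import requests

# Rough sketch: predict concepts for an image URL with a Clarifai
# general-recognition model. All credentials and URLs are placeholders.
response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/image.jpg"}}}]},
)

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are reported on a 0-1 scale; scaling by 100
    # gives the "people 99.7" style of the list above.
    print(concept["name"], round(concept["value"] * 100, 1))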

Imagga
created on 2021-12-14

man 26.2
person 23.6
uniform 22.5
male 22
people 20.6
clothing 17.9
war 17.3
weapon 16.2
black 15.6
device 15.5
military 15.4
astronaut 15.3
work 14.9
soldier 14.7
mask 14.6
men 12.9
human 12.7
equipment 12.6
instrument 12.5
adult 11.7
gun 11.7
nurse 11.7
worker 11.7
covering 11.6
hand 11.4
protection 10.9
military uniform 10.7
army 10.7
professional 10.4
safety 10.1
camouflage 10.1
music 9.9
art 9.8
battle 9.8
warrior 9.8
surgeon 9.4
armor 9.4
guy 9.2
portrait 9.1
paint 9.1
newspaper 9
metal 8.8
armed 8.8
body 8.8
mechanic 8.8
helmet 8.8
rock 8.7
engine 8.7
old 8.4
fashion 8.3
occupation 8.2
playing 8.2
danger 8.2
history 8
game 8
player 8
medical 7.9
product 7.9
forces 7.9
conflict 7.8
face 7.8
model 7.8
play 7.8
doctor 7.5
consumer goods 7.5
fun 7.5
technology 7.4
hospital 7.4
vehicle 7.4
business 7.3
breastplate 7.2
megaphone 7.1
job 7.1
working 7.1
musical instrument 7.1
medicine 7
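
The tags above are attributed to Imagga. A minimal sketch against Imagga's /v2/tags REST endpoint follows; the API key, secret, and image URL are placeholders, and the response shape is assumed from the public API documentation.

import requests

# Minimal sketch: tag an image by URL using Imagga's REST API with
# HTTP basic auth. Credentials and URL below are placeholders.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/image.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)

for item in response.json()["result"]["tags"]:
    # Each entry pairs a confidence (0-100) with a localized tag name,
    # matching the "man 26.2" style of the list above.
    print(item["tag"]["en"], round(item["confidence"], 1))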

Microsoft
created on 2021-12-14

person 99.3
text 96.4
outdoor 92
black and white 80.4
clothing 74.6
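
The tags above are attributed to Microsoft. A minimal sketch against the Azure Computer Vision tag operation follows; the resource endpoint, subscription key, image URL, and API version are placeholders, not details from the record.

import requests

# Minimal sketch: tag an image by URL with Azure Computer Vision.
# Endpoint, key, and image URL below are placeholders.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
response = requests.post(
    f"{endpoint}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY",
             "Content-Type": "application/json"},
    json={"url": "https://example.com/image.jpg"},
)

for tag in response.json()["tags"]:
    # Confidence is returned on a 0-1 scale; scaling by 100 matches
    # the "person 99.3" style of the list above.
    print(tag["name"], round(tag["confidence"] * 100, 1))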

Face analysis

AWS Rekognition

Age 23-37
Gender Male, 85.1%
Calm 99.3%
Surprised 0.3%
Sad 0.2%
Angry 0.1%
Fear 0.1%
Happy 0.1%
Confused 0.1%
Disgusted 0%

AWS Rekognition

Age 23-35
Gender Male, 79.1%
Calm 87.2%
Fear 4.9%
Angry 4.3%
Surprised 1.5%
Happy 0.9%
Sad 0.7%
Disgusted 0.4%
Confused 0.2%
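
The two age/gender/emotion blocks above are attributed to AWS Rekognition, so a minimal sketch of how such estimates could be obtained with the DetectFaces API via boto3 follows; the bucket and key names are hypothetical.

import boto3

# Minimal sketch: detect faces with all facial attributes requested.
# Bucket/key names below are placeholders.
rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]        # e.g. {"Low": 23, "High": 37}
    gender = face["Gender"]       # e.g. {"Value": "Male", "Confidence": 85.1}
    print(f"Age {age['Low']}-{age['High']}, {gender['Value']} {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types such as CALM or SAD with confidences (0-100),
        # matching the "Calm 99.3%" style of the blocks above.
        print(f"  {emotion['Type']} {emotion['Confidence']:.1f}%")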

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
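
The likelihood ratings above are attributed to Google Vision. A minimal sketch using the google-cloud-vision face detection client follows; the image URL is a placeholder and the client is assumed to be authenticated via default application credentials.

from google.cloud import vision

# Minimal sketch: run face detection on an image referenced by URL.
client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.com/image.jpg"

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enum values such as VERY_UNLIKELY or UNLIKELY,
    # matching the "Surprise Very unlikely" style of the list above.
    print("Surprise:", face.surprise_likelihood.name)
    print("Anger:", face.anger_likelihood.name)
    print("Sorrow:", face.sorrow_likelihood.name)
    print("Joy:", face.joy_likelihood.name)
    print("Headwear:", face.headwear_likelihood.name)
    print("Blurred:", face.blurred_likelihood.name)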

Feature analysis

Amazon

Helmet 99.4%
Person 98.2%
Hat 65.5%

Categories

Imagga

paintings art 62.9%
people portraits 21.1%
pets animals 11.9%
food drinks 1.1%

Text analysis

Amazon

ALFA
S
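
The detected text above ("ALFA", "S") is attributed to Amazon, so a minimal sketch of how such strings could be extracted with the Amazon Rekognition DetectText API via boto3 follows; the bucket and key names are hypothetical.

import boto3

# Minimal sketch: detect text in an image stored in S3.
# Bucket/key names below are placeholders.
rekognition = boto3.client("rekognition")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}},
)

for detection in response["TextDetections"]:
    # LINE detections return whole strings; WORD detections return
    # the individual tokens that make them up.
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))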