Human Generated Data

Title

Untitled (wounded soldiers being loaded onto helicopter, Vietnam)

Date

1967, printed later

People

Artist: Gordon W. Gahan, American, 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3841

Machine Generated Data

Tags

Amazon
created on 2022-06-03

Person 99.6
Human 99.6
Person 98.3
Military 96.2
Helmet 95.9
Clothing 95.9
Apparel 95.9
Face 95.4
Military Uniform 94.2
Army 92.7
Armored 92.7
People 87.9
Helmet 87
Person 86.3
Mammal 81
Animal 81
Canine 81
Pet 81
Dog 81
Soldier 76.3
Head 74.8
Portrait 69.7
Photography 69.7
Photo 69.7
Person 69.4
Crowd 59.1
Troop 58.4
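
The labels above are the kind of label/confidence pairs returned by Amazon Rekognition's label-detection API. A minimal sketch of such a call with boto3 follows; the file name and region are assumptions for illustration, not part of the record.

```python
# Minimal sketch: label/confidence pairs in the style listed above,
# via Amazon Rekognition's detect_labels. File name and region are
# hypothetical.
import boto3

def detect_labels(path: str, region: str = "us-east-1"):
    client = boto3.client("rekognition", region_name=region)
    with open(path, "rb") as f:
        response = client.detect_labels(Image={"Bytes": f.read()})
    # Each label carries a name and a confidence score on a 0-100 scale.
    return [(label["Name"], round(label["Confidence"], 1))
            for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("photo.jpg"):  # hypothetical file
        print(name, confidence)
```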

Clarifai
created on 2023-10-30

people 99.9
adult 99.2
group together 99.1
military 98.2
man 96.9
war 96.7
group 95.4
soldier 94.7
skirmish 93.5
woman 92.1
vehicle 90.3
wear 90.1
many 90.1
administration 89.4
reclining 87.5
two 85.5
outfit 85.5
several 84.8
recreation 83.5
military uniform 82.7
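
The Clarifai concepts above are the output style of its general image-recognition model. A hedged sketch against Clarifai's public REST API follows; the endpoint path, model id, and API key are assumptions drawn from Clarifai's documentation, not from this record.

```python
# Hedged sketch of a Clarifai prediction call over REST.
# Model id, endpoint, and key are assumptions, not from this record.
import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder
MODEL_ID = "general-image-recognition"   # assumed model id

def clarifai_concepts(path: str):
    with open(path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()
    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
        timeout=30,
    )
    resp.raise_for_status()
    concepts = resp.json()["outputs"][0]["data"]["concepts"]
    # Clarifai reports values on 0-1; scale to match the 0-100 style above.
    return [(c["name"], round(c["value"] * 100, 1)) for c in concepts]
```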

Imagga
created on 2022-06-03

man 28.9
person 25.5
percussion instrument 22.9
people 22.9
male 22.1
musical instrument 20.3
adult 19.4
drum 17.2
portrait 16.8
face 14.2
steel drum 13.6
men 12.9
human 12.7
mask 12.5
lifestyle 12.3
hair 11.9
hand 11.4
hands 11.3
sexy 11.2
body 11.2
world 11.1
industry 11.1
girls 10.9
industrial 10.9
relaxation 10.9
care 10.7
worker 10.7
women 10.3
love 10.3
black 10.2
work 10.2
skin 10.2
model 10.1
head 10.1
device 10
metal 9.7
salon 9.5
outside 9.4
outdoors 9.1
attractive 9.1
health 9
spa 9
sitting 8.6
smile 8.5
horse 8.5
happy 8.1
looking 8
smiling 8
working 8
child 7.9
factory 7.7
pretty 7.7
youth 7.7
machine 7.6
one 7.5
safety 7.4
occupation 7.3
guy 7.3
equipment 7.2
steel 7.1
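
The Imagga tags above follow the format of its /v2/tags endpoint. A hedged sketch of that call follows; the credentials are placeholders, and passing a public image URL is a simplification of the upload flow.

```python
# Hedged sketch of Imagga's tagging endpoint; credentials are placeholders.
import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder

def imagga_tags(image_url: str):
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    resp.raise_for_status()
    tags = resp.json()["result"]["tags"]
    # Each entry carries a confidence (0-100) and a language-keyed tag name.
    return [(t["tag"]["en"], round(t["confidence"], 1)) for t in tags]
```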

Microsoft
created on 2022-06-03

person 99.9
outdoor 96.8
clothing 88.8
black and white 86.3
human face 81.7
text 79.8
man 77
people 63.4
crowd 0.8
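
The Microsoft tags above match the output of Azure's Computer Vision tagging operation. A hedged sketch via the Azure Python SDK follows; the endpoint and key are placeholders, not values from this record.

```python
# Hedged sketch of Azure Computer Vision image tagging; endpoint and key
# are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                           # placeholder

def azure_tags(path: str):
    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
    with open(path, "rb") as image:
        result = client.tag_image_in_stream(image)
    # Confidence is reported on 0-1; scale to the 0-100 style used above.
    return [(tag.name, round(tag.confidence * 100, 1)) for tag in result.tags]
```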

Color Analysis

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 99.6%
Calm 57.6%
Disgusted 17.3%
Confused 13.5%
Surprised 7.9%
Fear 6.9%
Angry 4.4%
Sad 2.4%
Happy 1%

AWS Rekognition

Age 21-29
Gender Male, 99.9%
Sad 100%
Surprised 6.5%
Fear 6.2%
Calm 5.2%
Confused 4.2%
Angry 3.1%
Disgusted 1.3%
Happy 0.4%

AWS Rekognition

Age 22-30
Gender Male, 99.9%
Sad 98.6%
Surprised 20.6%
Calm 17.8%
Happy 6.2%
Fear 6%
Angry 1.8%
Disgusted 0.9%
Confused 0.6%
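
The age range, gender, and emotion percentages above are per-face attributes of the kind returned by Amazon Rekognition's face-detection call. A minimal sketch follows; the file name and region are hypothetical.

```python
# Minimal sketch of Amazon Rekognition face detection with full attributes
# (age range, gender, emotions). File name and region are hypothetical.
import boto3

def detect_face_attributes(path: str, region: str = "us-east-1"):
    client = boto3.client("rekognition", region_name=region)
    with open(path, "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()},
                                       Attributes=["ALL"])
    faces = []
    for face in response["FaceDetails"]:
        faces.append({
            "age_range": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": (face["Gender"]["Value"],
                       round(face["Gender"]["Confidence"], 1)),
            # Emotions are reported per face, sorted here by confidence.
            "emotions": sorted(
                ((e["Type"], round(e["Confidence"], 1)) for e in face["Emotions"]),
                key=lambda pair: pair[1], reverse=True),
        })
    return faces
```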

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
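
The likelihood buckets above (Very unlikely, Possible, and so on) are how Google Cloud Vision reports face attributes. A minimal sketch follows; it assumes application credentials are already configured in the environment.

```python
# Minimal sketch of Google Cloud Vision face detection, which reports
# likelihood buckets (VERY_UNLIKELY ... VERY_LIKELY) like those above.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set in the environment.
from google.cloud import vision

def face_likelihoods(path: str):
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    results = []
    for face in response.face_annotations:
        results.append({
            "surprise": vision.Likelihood(face.surprise_likelihood).name,
            "anger": vision.Likelihood(face.anger_likelihood).name,
            "sorrow": vision.Likelihood(face.sorrow_likelihood).name,
            "joy": vision.Likelihood(face.joy_likelihood).name,
            "headwear": vision.Likelihood(face.headwear_likelihood).name,
            "blurred": vision.Likelihood(face.blurred_likelihood).name,
        })
    return results
```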

Feature analysis

Amazon

Person 99.6%
Helmet 95.9%
Dog 81%
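
One plausible way to derive object-level entries like Person, Helmet, and Dog from the full label output is to keep only labels that come back with bounding-box instances; this derivation is an assumption for illustration, not documented in the record.

```python
# Hedged sketch: filter a Rekognition detect_labels response down to labels
# that carry bounding-box Instances, yielding object-level entries like
# Person / Helmet / Dog. The derivation itself is an assumption.
def object_labels(labels_response: dict):
    return [
        (label["Name"], round(label["Confidence"], 1))
        for label in labels_response["Labels"]
        if label.get("Instances")
    ]
```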