Human Generated Data

Title

Untitled (helicopter blades with helicopters parked below in background, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American, 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.361.5

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Face 100
Head 100
Photography 100
Portrait 100
Body Part 100
Finger 100
Hand 100
Clothing 99.9
T-Shirt 99.9
Person 99
Adult 99
Male 99
Man 99
Person 96.9
Adult 96.9
Male 96.9
Man 96.9
Person 92.4
Adult 92.4
Male 92.4
Man 92.4
Accessories 86.9
People 77.4
Outdoors 70.3
Transportation 65.8
Vehicle 65.8
Armored 65.2
Military 65.2
Tank 65.2
Weapon 65.2
Nature 61.8
Aircraft 57
Baseball Cap 56.7
Cap 56.7
Hat 56.7
Sunglasses 56.3
Worker 56
Arm 55.7
Glasses 55.6
Firearm 55.2
Gun 55.2
Rifle 55.2
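
The label/confidence pairs above resemble the output of Amazon Rekognition's DetectLabels API. The following is a minimal sketch, not the pipeline used for this record: it assumes AWS credentials are already configured for boto3, and "photo.jpg" is a hypothetical placeholder for the image file.

    # Sketch: retrieve label/confidence pairs similar to the list above
    # using Amazon Rekognition's DetectLabels API via boto3.
    # Assumptions: AWS credentials are configured; "photo.jpg" is a
    # hypothetical placeholder file name.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # roughly the lowest confidence shown above
    )

    # Print each detected label with its confidence score, e.g. "Face 100.0"
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')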

Clarifai
created on 2018-10-07

people 99.7
one 99
adult 98.9
man 98.6
portrait 96
monochrome 95.9
woman 91.8
wear 90.6
nude 89.4
music 88.6
vehicle 85.9
furniture 83
side view 81.9
recreation 81.8
indoors 79.3
model 79.1
seat 78.5
two 77.9
musician 77.8
light 77.6

Imagga
created on 2018-10-07

windshield wiper 34.8
mechanical device 29.2
device 28
mechanism 22.4
people 14.5
work 14.1
business 14
black 13.2
hands 12.2
hand 12.1
man 12.1
person 12
pen 11.7
paper 11
laptop 10.7
vehicle 10.2
stringed instrument 10.2
computer 10
musical instrument 9.9
adult 9.8
working 9.7
closeup 9.4
equipment 9.3
male 9.2
old 9
human 9
technology 8.9
office 8.8
keyboard 8.8
education 8.7
sand 8.4
bowed stringed instrument 8.2
home 8
table 7.9
car 7.9
dark 7.5
landscape 7.4
light 7.3
lifestyle 7.2
body 7.2

Google
created on 2018-10-07

Microsoft
created on 2018-10-07

person 98.9
indoor 93.8
man 93.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 38-46
Gender Male, 64.3%
Calm 72.5%
Sad 26.8%
Surprised 6.5%
Fear 6.1%
Angry 4.1%
Confused 2.1%
Disgusted 0.5%
Happy 0.1%
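
Age-range, gender, and emotion estimates like those above can be obtained from Amazon Rekognition's DetectFaces API. A minimal sketch under the same assumptions as before (configured boto3 credentials, hypothetical "photo.jpg" placeholder):

    # Sketch: request full face attributes (age range, gender, emotions)
    # from Amazon Rekognition's DetectFaces API via boto3.
    # Assumptions: AWS credentials are configured; "photo.jpg" is a
    # hypothetical placeholder file name.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, emotions, etc.
    )

    # Print estimates for each detected face, ordered like the list above.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')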

Feature analysis

Amazon

Person 99%
Adult 99%
Male 99%
Man 99%

Captions

Microsoft
created on 2018-10-07

a man sitting on a table 67.7%
a man sitting at a table 67.6%
a man sitting in a room 67.5%

Text analysis

Amazon

S.A
ANTE