Human Generated Data

Title

Untitled (soldiers and helicopter delivering supplies, central highlands near Dak To, Vietnam)

Date

1967

People

Artist: Gordon W. Gahan, American, 1945–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.475

Machine Generated Data

Tags

Amazon
created on 2019-08-10

Human 86.4
Person 86.4
Person 84.1
Person 79
Person 76.3
Person 69.5
Advertisement 68.8
Poster 68.8
Text 68.7
Collage 57
Paper 55.1

Clarifai
created on 2019-08-10

negative 99.9
filmstrip 99.8
movie 99.6
exposed 99.5
cinematography 98.6
slide 98.6
photograph 98.1
collage 96.5
picture frame 95.9
emulsion 95.4
old 91.9
roll 90.4
people 88.6
desktop 88.4
antique 88.1
noisy 87.2
dirty 87
bobbin 86.5
security 84.7
retro 84.6

Imagga
created on 2019-08-10

screen 35.4
background 35.1
display 26.3
device 20
technology 17.1
business 15.8
modern 15.4
electronic device 14.1
electronic 14
black 13.8
equipment 13.7
design 13.5
billboard 12.1
structure 12.1
signboard 11.7
television 11.5
light 11.3
digital 11.3
art 11.1
sunglass 11
communication 10.9
button 10.6
sunglasses 10.5
computer 9.7
keyboard 9.7
information 9.7
new 9.7
control 9.5
system 9.5
corporate 9.4
industry 9.4
lights 9.3
data 9.1
pattern 8.9
locker 8.7
man 8.7
space 8.5
tech 8.5
spectacles 8.5
hand 8.3
city 8.3
entertainment 8.3
fastener 8
architecture 7.8
people 7.8
shop 7.8
glass 7.8
keypad 7.8
remote 7.7
numbers 7.7
office 7.6
person 7.4
graphic 7.3
window 7.3
music 7.2
financial 7.1
work 7.1

Google
created on 2019-08-10

Microsoft
created on 2019-08-10

screenshot 99.6
text 98.2
poster 57.5
person 50.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 28-44
Gender Male, 54.5%
Sad 45.5%
Happy 45%
Angry 45.6%
Fear 45.1%
Confused 45.2%
Disgusted 45.1%
Calm 53.1%
Surprised 45.4%

AWS Rekognition

Age 21-33
Gender Male, 53.5%
Disgusted 45.1%
Fear 45.1%
Sad 45.2%
Angry 45.8%
Surprised 45.4%
Confused 45.1%
Happy 45.1%
Calm 53.3%

AWS Rekognition

Age 34-50
Gender Male, 50.4%
Calm 49.6%
Confused 49.5%
Surprised 49.6%
Sad 49.7%
Happy 49.6%
Fear 49.7%
Disgusted 49.6%
Angry 49.7%

AWS Rekognition

Age 1-7
Gender Male, 50.4%
Sad 50.5%
Angry 49.5%
Happy 49.5%
Disgusted 49.5%
Confused 49.5%
Fear 49.5%
Surprised 49.5%
Calm 49.5%

Feature analysis

Amazon

Person 86.4%

Categories

Imagga

interior objects 97.7%
food drinks 1.4%

Captions

Microsoft
created on 2019-08-10

a group of people in a room 42.4%

Text analysis

Amazon

20
E KTACHROME
6l