Human Generated Data

Title

Untitled (USO show, Long Binh Post, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American, 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.290.5

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Person 99.5
Human 99.5
Person 98.8
Person 98.8
Person 93.5
Crowd 88.9
Audience 87.2
Person 86.3
Person 85
Person 84.8
Indoors 69.8
Room 69.8
Person 68.7
Stage 63
People 60.3
Person 60.3
Person 60.2
Advertisement 59.2
Poster 59.2
LCD Screen 57.6
Electronics 57.6
Display 57.6
Monitor 57.6
Screen 57.6
Person 48.7
Person 44.4
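
The Amazon rows above pair a label with a confidence score. A minimal sketch of how such label/confidence pairs could be produced with AWS Rekognition's DetectLabels call (the bucket and object names here are hypothetical placeholders):

```python
# Minimal sketch, assuming the tags above come from AWS Rekognition DetectLabels;
# the S3 bucket and key names are hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "2007.184.2.290.5.jpg"}},
    MaxLabels=50,
    MinConfidence=40,
)

# Print each label with its confidence, matching the "label  score" rows above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```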

Clarifai
created on 2019-08-09

people 99.8
adult 98.6
monochrome 97
group 97
many 95.9
man 94.9
woman 92.3
crowd 88.8
war 87
one 85.5
music 85
indoors 84.9
wear 84.2
group together 83.2
child 82.9
room 80.1
no person 77.1
interaction 76.2
military 73.6
furniture 71.4

Imagga
created on 2019-08-09

jigsaw puzzle 42.4
city 38.2
puzzle 33.5
architecture 32
game 23.9
building 23.6
cityscape 21.8
travel 21.1
town 20.4
tower 19.7
old 18.1
urban 17.5
river 17.2
history 17
landscape 16.4
buildings 16.1
water 15.3
skyline 15.2
structure 15.1
sky 14
scene 13.8
landmark 13.5
grunge 12.8
tourism 12.4
bridge 12.3
ancient 11.2
vintage 11
fan 10.7
retro 10.6
house 10.1
antique 9.8
texture 9.7
winter 9.4
church 9.2
art 8.8
aerial 8.7
skyscraper 8.6
follower 8.6
historical 8.5
historic 8.2
new 8.1
religion 8.1
room 8
day 7.8
castle 7.8
sepia 7.8
tree 7.7
hall 7.6
panorama 7.6
famous 7.4
palace 7.4
light 7.3
person 7.3
aged 7.2
black 7.2
modern 7

Google
created on 2019-08-09

Microsoft
created on 2019-08-09

indoor 94.8
person 85.6
text 76
black and white 69.5
clothing 64.6
decorated 57.6
crowd 2.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 42-60
Gender Male, 50.2%
Happy 49.9%
Disgusted 49.5%
Confused 49.5%
Sad 49.8%
Fear 49.5%
Surprised 49.5%
Angry 49.5%
Calm 49.7%

AWS Rekognition

Age 37-55
Gender Male, 50%
Sad 50.4%
Confused 49.5%
Surprised 49.5%
Disgusted 49.5%
Angry 49.5%
Happy 49.5%
Fear 49.5%
Calm 49.5%

AWS Rekognition

Age 8-18
Gender Female, 50%
Calm 49.7%
Confused 49.8%
Sad 49.7%
Disgusted 49.5%
Happy 49.7%
Angry 49.5%
Fear 49.6%
Surprised 49.5%

AWS Rekognition

Age 13-23
Gender Female, 50%
Angry 49.6%
Confused 49.5%
Happy 49.5%
Disgusted 49.5%
Calm 49.6%
Fear 49.5%
Sad 50.3%
Surprised 49.5%

AWS Rekognition

Age 64-78
Gender Male, 50.4%
Fear 49.5%
Calm 49.6%
Happy 49.5%
Confused 49.9%
Disgusted 49.5%
Angry 49.5%
Surprised 49.5%
Sad 49.9%
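
Each block above reports an estimated age range, gender, and per-emotion confidence for one detected face. A minimal sketch of how such output could be obtained from AWS Rekognition's DetectFaces call with all facial attributes requested (the local file name is a hypothetical placeholder):

```python
# Minimal sketch, assuming the age/gender/emotion rows above are the AgeRange,
# Gender, and Emotions attributes returned by Rekognition DetectFaces.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("2007.184.2.290.5.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```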

Feature analysis

Amazon

Person 99.5%
Poster 59.2%

Categories

Captions

Microsoft
created on 2019-08-09

a group of people in a room 53.5%
a group of people around each other 26.9%
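
The two captions above are ranked alternatives with confidence scores. A minimal sketch of how such captions could be requested from the Azure Computer Vision "describe" operation over REST (the endpoint, key, and file name below are placeholders, and the v2.0 API version is an assumption based on the 2019 creation date):

```python
# Minimal sketch, assuming the captions above come from Azure Computer Vision's
# /describe operation; endpoint, subscription key, and file name are placeholders.
import requests

endpoint = "https://<your-region>.api.cognitive.microsoft.com"
subscription_key = "<your-key>"

with open("2007.184.2.290.5.jpg", "rb") as image_file:
    response = requests.post(
        f"{endpoint}/vision/v2.0/describe",
        params={"maxCandidates": 3},
        headers={
            "Ocp-Apim-Subscription-Key": subscription_key,
            "Content-Type": "application/octet-stream",
        },
        data=image_file.read(),
    )

# Each candidate caption carries a 0-1 confidence, shown above as a percentage.
for caption in response.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```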

Text analysis

Amazon

BINH
SHOWS
USO
RVN
ARMY RVN
ARMY
LONG BINH
LONG
ar
t USO ar
t
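
The strings above are individual line and word detections read from the image. A minimal sketch of how such text detections could be produced with AWS Rekognition's DetectText call (bucket and key names are hypothetical placeholders):

```python
# Minimal sketch, assuming the strings above are Rekognition DetectText results;
# the S3 bucket and key names are hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "2007.184.2.290.5.jpg"}}
)

# Each detection is either a LINE or a WORD; both kinds appear in the list above.
for detection in response["TextDetections"]:
    print(detection["DetectedText"])
```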

Google

ARMY RVN LONG BINH USO SHOWS
ARMY
RVN
LONG
BINH
USO
SHOWS