Written by Thomas Langer


In recognition of Insider Threat Awareness Month, we are pleased to have Charles S. Phalen Jr. join us for a wide-ranging discussion of the insider threat risk and the changes he has seen during his exemplary career in both the U.S. government and industry.  As many of you know, Mr. Phalen, now the Principal of CS Phalen & Associates, was most recently the Acting Director of the Defense Counterintelligence and Security Agency (DCSA), where he oversaw the evolution of this new agency as it merged with his prior agency, the National Background Investigations Bureau (NBIB), which he had directed from its inception.

Prior to the NBIB, Mr. Phalen was Vice President, Corporate Security, for Northrop Grumman Corporation, where until his retirement he led the security organization responsible for overseeing the security policies, procedures and processes that protect company employees, information and property.

Prior to that, Mr. Phalen spent 30 years in the federal service. His government positions included Director of Security for the Central Intelligence Agency (CIA); Assistant Director, Security Division, Federal Bureau of Investigation; Chief, Protective Programs Group, CIA Office of Security; Executive Officer, CIA Office of Security; Center Chief, CIA Office of Facilities and Security Services; and Chief, Facilities and Information Security Division, National Reconnaissance Office. Previously, he worked in or managed security activities involving investigations, operations support, risk analysis, and facility and asset protection, in the United States and abroad.

Thomas Langer (TL): Charlie, thank you for supporting this blog and I know you’ll have some interesting perspectives given the years you have dedicated to this profession.  First, given this is Insider Threat Awareness Month for 2020, what challenges do you see to successful insider threat awareness and are there any systemic issues that still exist in government and industry?

Charles Phalen (CP): If there is any good news, I think building the general awareness that there is such a thing as an insider threat isn’t the challenge it once was.  Between a broader ability to get the message out to relevant populations and the increasing cadence of arrests, topped off with recurring (perhaps infuriating?) appearances by Edward Snowden, I think people know there is a problem.  A harder challenge is answering the logical questions from the average person: “So, how is this my problem to solve?” and “What are you going to do about it?”  We need to get beyond basic awareness and engage our constituent populations to accept some responsibility and participate actively.  If there are any systemic government or industry issues, I would point to the challenge of sharing relevant information in an employment environment that is understandably careful about individual privacy.

TL: What insider case first impacted you in your early CIA career and why?

CP:  We’ll have to go back further than that.  Not well reflected in my resume is the 8 years I spent in the private sector before joining the CIA.  I worked and managed in the security department of a large Washington area department store (now vanished like all of them from the 70s).  I learned a lot there: profit and loss, physical security, loss prevention, arrest and prosecution, management (maybe some by trial and error), and M&A.  A big piece was an introduction to employee trust – and betrayal.  One early case in particular grabbed my attention.  Each store had a cash office (you remember cash?) and the store in which I worked was experiencing some end-of-day shortages.  There was a lot of cash flowing through there as each sales register closed out and you can imagine the temptation to just grab some of it.  The night shift was mostly college-age kids (the usual suspects), but it turned out the seemingly least likely candidate – a 40-ish rock-solid night manager – was the actual perpetrator.  He wasn’t a guy I knew well but he had a lot of respect among his co-workers. The shock of this arrest greatly rattled the store manager and his executive team but in retrospect, the circumstances sound pretty familiar today: he had run into money problems, took a little at first, then got more comfortable with it and got in way over his head.  Paraphrasing him, “Yes, it was wrong, but it was so easy, and no one seemed to miss it, so I kept on.  I just didn’t think.”

That was a familiar refrain I heard over and over in that job.  Putting it into perspective, for about a year I was the coordinator of the employee theft program at the company.  In what was an average year for us, we identified and terminated (and in many cases prosecuted) over 500 employees out of a workforce of about 7,000.  It puts a real strain on the concept of trust.

So, a few years later I show up at CIA and during one of the lengthy new employee training programs, I meet a lot of folks; some become friends and some friends-of-friends. One person in the latter group we’ll call Ed.  About 4 years into my career, a couple of my Office of Security colleagues came to see me, showed me his picture, and asked, “Do you know this guy?”  Um, yes, but not well.  Why?  Turns out it was Edward Lee Howard.  Again, I didn’t know him well but friends I trusted had trusted him and they were very upset.  He was on the front end of the infamous 1985 Year of the Spy, when we all began to scratch our heads over how to begin and continue to trust colleagues.  Unfortunately, we never got a chance to ask him how he felt about this since he escaped to Moscow before he could be arrested.

One big difference between these two experiences was the framing of the loss, which ultimately weighs on the risk equation of any security program.  In the retail world, the money or merchandise that is stolen is pretty easily replaceable.  You get more merchandise from the stock room, move in more cash, and if the losses get substantial, you consider raising prices.  Not so in the intelligence and defense world.  The people and capabilities – those things that give this nation an edge – that were compromised by Ed Howard, Rick Ames, all the way through and beyond Edward Snowden are not so easily replaceable.  You don’t just snap your fingers and recruit a new asset or develop a new collection system.

TL:  Although it’s been a number of years since the Aldrich Ames case at CIA, there are still lessons that come from that case.  I know you feel the same. Can you share what stuck with you about his betrayal and ultimate detection?

CP: Fortunately, my contact with him was pretty minimal, on the periphery of some events in the mid-80s.  That said, the revelations of his betrayal reverberated across the Agency and for us in the Office of Security reamplified the lessons from the Year of the Spy.  From my vantage point, in terms of the loss equation, what he provided to the Soviet Union over several years badly damaged collection capability and resulted in loss of life of cooperative sources – the first hard to replace and the second irreplaceable.  It also brought out some of the common motivational themes – a real or perceived serious financial challenge (a divorce settlement and a new spouse who liked to spend money), an angry response to lack of recognition in the workplace (“I’m smart. You don’t respect how smart I am. I’ll show you!”), and of course access to information that he knew was of great value.  No idealization of communism or rejection of democracy, just a simple way to solve his problems (kind of like that department store cashier).  The nagging question in this and all the other cases was – and still is – were there any hints that something was not right? Well, yes, sort of, but the follow-up could have been better – and down the road it at least helped inform the development of some of the concepts of continuous evaluation.

TL: When you assumed the role of Assistant Director, Security Division, for the FBI, you did so after the arrest of FBI spy Robert Hanssen.  That must have been a difficult time given the extent of his betrayal and the need for an internal redesign of the agency’s approach to security.  What lessons from that translated into other parts of your professional life thereafter? 

CP:  Fortunately, FBI leadership recognized the problem and had already drawn up plans for pulling elements from about six different parts of the FBI into a Security Division.  When I arrived downtown, it had been recently established, so the next step was consolidating and focusing its mission responsibility, authority, and activities.  It covered – still covers, I believe – the three main themes of any security program: Do I trust the people? Am I operating in a safe and secure physical environment?  Is my digital data secure and uncompromised?  Hanssen himself had misused IT systems, removed physical artifacts, and was himself fully compromised.  The key message from this event was the need for organizational elements to work together.  The FBI, from Director Mueller on down, knew and supported this and the employees of this new division were committed to meeting this mission need.  As we looked at things, I was reminded of a lesson learned (perhaps a little more painfully) in the 90s: the first-line supervisors and middle managers are more often than not closer to both understanding a problem and coming up with a solution than any other echelon of the organization.  I took full advantage of that while at the FBI as well as at every job since.  This role also reinforced one of my strong beliefs regarding mergers: effective change does not arrive with a marching band and banners with slogans; instead it should arrive calmly, evolve steadily and not impede the day-to-day work.  The organization should focus on the job, focus on the mission, and the rest will generally take care of itself.  My final lesson from my FBI time was that it is all about relationships.  When the crisis hits, it is too late to figure out where help will come from.  Reach out and begin building partnerships early – and maintain them!

TL:  The background investigative process remains a critical component of providing the national security mission with the trusted workforce it requires. Since the inception of the NBIB, now merged with DCSA, how do you feel that mission has improved and complements the various insider threat programs?

CP: You are right – it is still a critical component of a trust decision, which is itself important since it is generally a human who causes the problem, either by commission or omission – or more colloquially, they are either evil or careless.  The actual process is under significant review right now in the Trusted Workforce 2.0 activity, but the basic premise will remain the same – create the baseline (the initial collection and adjudication) and then re-review to maintain that trust (periodic reinvestigation in the past and continuous vetting (CV) in the future).  The adjustment will come in two areas: First, a review of what I need to know about a particular human as I make that trust decision and where I can reliably find it.  Second (and this is the bigger change), how I can reliably keep my eye on this person to track any critical changes that might affect my trust decision.  Today that happens every 5 years; in a full-up CV program, it is a lot sooner and aperiodic. Back to your original question: the DCSA investigations program has greatly enhanced (and continues to enhance) its electronic collection activity, including employment, education, finance, and criminal records where available.  This alone will greatly assist agencies, which will need these information feeds as they build and mature their continuous vetting capabilities.  This also allows investigators to reduce (although not eliminate) their physical record search requirements, but more importantly, allows them to focus their efforts and skills on those important human interactions with the subject of the investigation and with those who provide observations, context, clarifications, and other important non-record evaluations of the subject.

TL:  In many companies the Insider Threat Program Senior Official (ITPSO) is frequently the FSO or CSO depending on the size of the organization.  Given that so many cleared entities have limited security resources, what does an effective insider threat program look like? 

CP:  There is no one-size-fits-all organizational model for building an effective insider threat program.  A process that works for a large and focused defense contractor will not fit well in a small company.  The key is less about an org chart and more about how the company pays attention to employees and commits to resolving problems.  Step one is understanding the landscape in the company – what information does it have about employee activities and behaviors, where does it exist, and how can we literally or virtually consolidate it.  Much of the useful information will be sourced inside the company in HR, performance reviews, security reports, and financial records.  A subset of that may be a recognition that there is data the company should collect but doesn’t – yet.  Second is the commitment to act: to recognize, evaluate and, if necessary, mitigate relevant concerns that surface in the data collection.  Not even the big companies can follow up on everything.  The key will be triage: focusing on significant single issues and seeing when multiple smaller issues are starting to build into a bigger problem.  Finally, when you start doing all this, you will see things you never saw before about your employees.  Remember this isn’t about catching spies.  When they get that far, it is already too late.  It is about finding problem employees earlier in the chain and either helping them back on track (which actually works in many cases) or offloading them because their values don’t match the company’s.  It is critical to simply pay attention – and act.

TL:  I know I am dating myself here, but advances in technology allow data to be consolidated and miniaturized to the point that one can steal far more than ever before with less risk of detection. How important is the partnership between Security and IT?

CP:  The good news is that the consolidation of information provides great benefits – not just reducing storage space – but more importantly allowing for the discoverability and utility of critical data.  The concept of “Need To Know” has given way to “Duty To Share.”  The technology has merged with a real need to share data quickly and easily, particularly the information – much of it classified – that gives this country a tactical or strategic edge.  More simply, there’s no point in collecting information if it can’t be seen or used by those who need it.  (I have much longer stories to share over a beer.)  So, mostly all good, but share with whom?  That line gets very blurred in a Duty To Share environment.  At its extreme, did anyone really envision or intend that a PFC at a forward operating base in Iraq should be able to read the private correspondence between the Ambassador to the Vatican and the Secretary of State?

Let me add another dimension.  The case I refer to above and several others resulted in a significant volume of information leaving the house, and we are both impressed and sobered by that volume.  With that said, within all that volume there is often a significant fact (or maybe a couple) that by itself causes the most damage.  How a collection capability works is less important than the simple knowledge that it can be – and is being – done, and the exposure of that fact is crippling to ongoing operations.  This was a problem with Ames, Hanssen, Kampiles, and Howard in the “old days” and it hasn’t gone away.

So, to your question, yes, it is really important that the IT and Security organizations are tightly linked, starting with the CIO and the head of Security, and that link needs to be both philosophical and tactical. In today’s IT environment, a lot more people will have access to tons of data and that simple significant fact.  I don’t have a magic answer, but this confluence of possibilities emphasizes the challenge of continuous trust of humans, the need for reasonable controls on networks and the necessity for that partnership between IT and Security.

TL:  Any final thoughts?

CP:  This is not a problem that will simply evaporate.  Best quote on this:  Paul Redmond, a long-time counterintelligence leader at CIA, conducted the Robert Hanssen damage assessment.  Shortly after I arrived at the FBI, I joined him as he made the rounds briefing committees on the Hill.  At the end of one briefing, a senator leaned over and asked Paul if he thought there were any more spies in the U.S. government.  Paul, without missing a beat, replied, “Sir, I have been doing this for a number of years.  I’d say it is a statistical certainty.”   I agreed, and still agree.  We have our work cut out for us.


About the Author

Mr. Thomas Langer has a 30-year track record as an industry security executive, including 20 years with BAE Systems, and will be periodically sharing his knowledge on crucial, relevant topics here on this Blog page.  Learn more about Thomas here.