Researchers say using technology to detect behavioral patterns could help federal managers screen out mischief-makers.
The term “insider threat” describes everything from government employees who snap on the job and commit violence to those who leak national secrets.
But researchers say using technology to detect otherwise hidden behavioral patterns could help federal managers screen out mischief-makers of all stripes.
Moreover, they could do so within the bounds of privacy.
The FBI’s profiling unit, popularized by the procedural drama “Criminal Minds,” has begun a multiyear project studying how technology can help identify insider threats.
“Is there a similarity between the person who is putting a logic bomb on your network and the person who is going to throw a bomb in your office?” said Supervisory Special Agent Kevin Burton of the FBI Cyber Behavioral Analysis Center. “Are the behaviors different? Do they intersect?”
A "logic bomb" of malicious code wiped computers at Sony Pictures in a recent hack.
“It is a holistic, intercommunity approach,” Burton said of the agency's research.
Some Scenarios Defy Tech Fixes
To prevent data breaches, managers typically use computer access controls. This includes requiring employees to log in with a password and smart card as well as blocking the ability to download files.
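The layered controls described above can be sketched in a few lines. This is an illustrative toy, not any agency’s actual system: the user record, credentials and policy below are all invented, and a real deployment would use a hardware token validator rather than a string comparison.

```python
# Toy sketch of two-factor access control plus a blanket download block.
# All names, credentials and data structures here are hypothetical.
import hashlib

USERS = {
    "jdoe": {
        "pw_hash": hashlib.sha256(b"hunter2").hexdigest(),
        "card_id": "CARD-1234",  # stand-in for a smart-card credential
    }
}

def authorize(user, password, card_id, action):
    """Allow an action only if BOTH factors check out, and never allow downloads."""
    record = USERS.get(user)
    if record is None:
        return False
    two_factor_ok = (
        hashlib.sha256(password.encode()).hexdigest() == record["pw_hash"]
        and card_id == record["card_id"]
    )
    # Downloads are blocked for everyone, regardless of credentials.
    return two_factor_ok and action != "download"

print(authorize("jdoe", "hunter2", "CARD-1234", "read"))      # True
print(authorize("jdoe", "hunter2", "CARD-1234", "download"))  # False
```

The point of the second check is exactly the one the article makes: even a fully authenticated insider can be denied a risky capability outright.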
But other situations may defy simple tech fixes.
How do you stop a Transportation Security Administration officer, for example, from messing with the no-fly list? That’s not an unrealistic scenario. A British immigration officer reportedly barred his wife from returning home from Pakistan for three years by placing her on a terrorist watch list.
Ultimately, human beings, not computers, are behind cyber or physical breaches. And human behavior can be studied, researchers say.
"Traditionally, this problem has pretty much been dealt with in the IT shop as an issue," Burton said. Now, supervisors also are probing the huge digital footprints everyone leaves behind, including credit score changes and speeding ticket records, to spot an employee about to harm the government.
“If there’s a financial issue, if there is a relationship issue -- it doesn’t mean that that in and of itself makes that person an insider threat,” Burton said. But if “you see that there is [a] psychosocial behavioral issue” that needs addressing, bosses should loop in employee assistance program personnel, privacy staff and perhaps legal counsel to assist the troubled individual, he said.
Big Data Cuts Both Ways for Task Force
Burton was participating as an audience member at a forum on government insider threats hosted by Nextgov on Wednesday.
He echoed points made by panelist Patricia Larsen, co-director of the National Insider Threat Task Force. The Obama administration created the office after former soldier Chelsea Manning shared U.S. secrets with WikiLeaks.
In a world where nearly every classified government activity is digitized, there is more material for hackers to target, Larsen noted.
But big data also lets agencies be smarter about who they keep tabs on.
"It cuts in both directions," she said. "I also have a lot more information about people.”
She added, “Some of our tools and techniques that we are trying to explore are making use of all the data we have about people -- about common patterns -- to our benefit so we can decide what’s anomalous and what’s not.”
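The approach Larsen describes, learning what common activity looks like across a population and then flagging sharp deviations, can be sketched simply. The scenario below is invented for illustration (user IDs, download counts and the threshold are not from any government system); real tools would use many signals and more robust statistics.

```python
# Hypothetical sketch of baseline-vs-anomaly detection: compute the
# population mean and standard deviation of one behavioral signal,
# then flag users far above the baseline. Illustrative data only.
from statistics import mean, stdev

def flag_anomalies(daily_downloads, threshold=1.5):
    """Return user IDs whose download counts sit more than
    `threshold` standard deviations above the population mean."""
    counts = list(daily_downloads.values())
    mu, sigma = mean(counts), stdev(counts)
    return [user for user, n in daily_downloads.items()
            if sigma > 0 and (n - mu) / sigma > threshold]

activity = {"u1": 12, "u2": 9, "u3": 11, "u4": 10, "u5": 250}
print(flag_anomalies(activity))  # ['u5']
```

This is the essence of “deciding what’s anomalous”: no single data point proves wrongdoing, but an outlier earns a closer look by a human reviewer.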
What 'World of Warcraft' Can Teach the Intelligence Community
Academics already have tested this theory on video gamers.
A 2012 paper by the Palo Alto Research Center demonstrates how insider threats can be identified through the personality profiling of players and their in-game social networks. The experiment was performed on more than 50,000 players of the online game "World of Warcraft" over a six-month period.
Researchers predicted which players would quit their “guilds,” possibly damaging these player associations, based on their in-game behavior. Among the 68 actions assessed were players’ achievements in reaching major milestones in the game and the manner of the characters’ deaths.
Gamer nicknames also factored into predictions.
Researchers wrote: “A player usually picks a character name appealing to him/her after some thinking. Guild names are also chosen carefully to reflect the intended 'social tone' of the group (for instance, ‘Merciless Killers’ conveys a different impression than ‘The Merry Bards’)."
So, one can “analyze character names and the name of a character’s guild to get additional information on the player’s personality.”
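A prediction of this kind boils down to scoring a set of behavioral features. The sketch below is purely illustrative: the PARC study assessed 68 in-game behaviors, and the four features and weights here are invented for demonstration, not taken from the paper.

```python
# Illustrative guild-quit risk scorer. Feature names, weights and the
# normalized inputs are hypothetical, not from the PARC study.
import math

FEATURE_WEIGHTS = {
    "days_since_last_login": 0.30,   # disengagement raises quit risk
    "guild_chat_messages": -0.20,    # social ties lower quit risk
    "recent_achievements": -0.15,    # milestone progress lowers quit risk
    "solo_play_fraction": 0.25,      # drifting away from the guild
}

def quit_risk(player):
    """Weighted sum of normalized (0..1) features, squashed to a 0..1 risk."""
    score = sum(FEATURE_WEIGHTS[f] * player.get(f, 0.0) for f in FEATURE_WEIGHTS)
    return 1 / (1 + math.exp(-score))

engaged = {"days_since_last_login": 0.1, "guild_chat_messages": 0.9,
           "recent_achievements": 0.8, "solo_play_fraction": 0.2}
drifting = {"days_since_last_login": 0.9, "guild_chat_messages": 0.1,
            "recent_achievements": 0.1, "solo_play_fraction": 0.8}
print(quit_risk(drifting) > quit_risk(engaged))  # True
```

The analogy to insider-threat screening is direct: swap the in-game features for workplace signals, and the same scoring logic ranks who merits a closer look.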
Still, there are limits to the prognostication for now.
For all the technology, the best detection tool at this point may simply be an alert boss or colleague who recognizes that an individual is having a problem and could pose a threat, Larsen said.
“No one particular thing is going to say, ‘That’s it -- he’s a spy,’” Larsen said.