
What 18F and Other Rule Breakers Teach Us About Insider Threats



By Dan Velez April 5, 2017


Dan Velez is director of insider threat operations at Forcepoint.

They were the IT equivalent of the “Pros from Dover”—a collection of highly skilled tech specialists who steered federal agencies into the fast lane of digital modernization. They were talented. They were empowered. And they were insider threats.

Operating within the General Services Administration, they were known as 18F, a “civic consultancy for government … built in the spirit of America’s top tech startups.” Among other accomplishments, 18F launched Cloud.gov; worked with the Federal Election Commission to make campaign finance data more accessible to the public; and created an open-source government “developer services marketplace.” The marketplace took advantage of private sector agile tech to accelerate the acquisition and deployment of the latest and greatest mission-supporting apps.

While so much went right with 18F, something went wrong: Its members broke a few rules along the way—rules intended to protect IT systems and data from threats and compromises. Indeed, the Office of Inspector General’s Office of Inspections and Forensic Auditing concluded 18F “routinely disregarded and circumvented fundamental GSA information security policies and guidelines,” according to its Feb. 21 report, “Evaluation of 18F’s Information Technology Security Compliance.”

In a previous evaluation of 18F conducted in May 2016, the OIG found the unit’s widespread use of Slack—the popular online messaging and collaboration app—potentially exposed sensitive information over a five-month period and led to a data breach in March. The new report is intended as a follow-up, and it reveals a number of additional practices that reflect classic patterns of an insider threat. These developments align with what are increasingly recognized as the traits of the convenience-seeker, know-it-all and entitled insider threats:

  • No fewer than 27 members of the 18F team frequently used personal email accounts to send work-related emails without copying or forwarding the messages to their official GSA email accounts, as required.
  • 18F contracted for more than $24.8 million in IT acquisitions last year without any review or approval by the GSA’s chief information officer. This violates a June 2015 memorandum of agreement that established 18F’s funding source for operations, which states that, in accordance with the Federal Information Technology Acquisition Reform Act, agency CIOs must play a significant role in the acquisition and oversight of IT.
  • A senior 18F member improperly appointed himself as an information systems security officer for the unit, creating its own security assessment and authorization processes and circumventing the GSA officials assigned to issue such approvals.
  • None of 18F’s 18 IT systems were authorized to operate in the GSA IT environment. At least two systems—including Google drives possibly exposed on Slack—contained personally identifiable information. In addition, 100 of the 116 software apps used by 18F—including Hackpad, a collaboration tool; Pingdom, a website monitoring product; and Hootsuite, a social media marketing and management dashboard—were not approved for deployment in the GSA IT environment. What’s more, they were never submitted for approval.

On the positive side, such practices—the subject of much scrutiny once cast into the spotlight—often encourage self-examination and introspection among agency leaders. After all, 18F is hardly unique; similar examples of hubris emerge often within organizations today. To help ensure they don’t emerge in yours, here are five lessons we can learn from 18F about insider threats and the need to understand the behavior of the people handling your organization’s data and intellectual property:

Cybersecurity isn’t a tech problem. It’s a human problem. And it begins and ends with insider threats. 18F is a classic case of privileged users who decided they were so busy innovating, the rules didn’t apply to them. The Defense Human Resources Activity office describes these kinds of users as grandiose narcissists, people who “feel they are so smart or so important that the rules, which were made for ordinary people, do not apply to them. Rules and social values are not necessarily rejected as they are by the antisocial personality; it is just that one feels above the rules.”

Without any meaningful checks on its behavior by a chief information security officer or a security team with authority, 18F did as it pleased. This type of enablement only encourages a potentially destructive culture to spread. Once nonprivileged users observe what the hotshots get away with, they feel it’s perfectly acceptable to do the same. If users lack understanding of, and respect for, the rules of information assurance, agencies are in danger of cultivating a culture of compromise.

It starts at the top. It may sound like a cliché, but leaders must appreciate that “do as I say and as I do” matters greatly. They have to clearly articulate insider-threat-focused rules that dictate the safeguarding of systems and data. They need to lead by example in following those rules, while convincing managers and users at all levels that insider-threat policies and practices directly support critical mission objectives. Then, they should support ongoing training and awareness opportunities that introduce new best practices for identifying and responding to potential insider-threat incidents, and that reaffirm old ones.

At the GSA, this quality of leadership appears to be missing at the moment, as its Technology Transformation Service Commissioner Rob Cook recently made excuses for 18F by saying, “We will be pressing up very hard against these rules because we believe they are stifling … our ability to be agile, to use technology in an innovative way to deliver the best services.”

Believing rules stifle agile innovation shows a lack of understanding about insider-threat programs and the need to examine the intent of those insiders. It’s not an either/or outcome, as in you either get to innovate while jeopardizing data, or you lock everything down and achieve far less. That simply isn’t the case at all, especially when…

Risk management brings clarity. There’s never enough discussion about the value of a well-conceived risk management plan—one that doesn’t serve as the “department of no” that Cook alludes to. An effective risk-management plan reasonably weighs rewards against risks to strike a proper balance between information assurance and the need to get business done.

The National Institute of Standards and Technology endorses this approach. While its tech authorization and usage requirements once conveyed a rigid, checklist-like tone, NIST now allows for some flexibility so agencies may incorporate situational variances into the implementation of controls.

However, risk management must always factor into the equation. Systems managers need to collaborate with business units to calculate the potential losses of every action, and whether the projected benefits of the action outweigh the worst-case scenario for damages. Only if they do can a shortcut be justified.
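The comparison described above is, at its simplest, an expected-value calculation: weigh the projected benefit of an action against the likelihood-weighted cost of its worst case. A minimal sketch of that arithmetic follows; all function names and dollar figures here are hypothetical illustrations, not any agency’s actual risk methodology.

```python
# Toy expected-loss comparison. All names and figures are hypothetical,
# purely to illustrate the benefit-vs-worst-case weighing described above.

def expected_loss(likelihood: float, impact_dollars: float) -> float:
    """Expected loss = probability of the bad outcome times its cost."""
    return likelihood * impact_dollars

def shortcut_is_justifiable(benefit_dollars: float,
                            likelihood: float,
                            worst_case_dollars: float) -> bool:
    """A shortcut is defensible only when its projected benefit exceeds
    the expected loss from the worst-case outcome."""
    return benefit_dollars > expected_loss(likelihood, worst_case_dollars)

# Hypothetical scenario: deploying an unapproved collaboration app saves
# $50k in staff time, but carries a 5% chance of a $2M breach.
print(shortcut_is_justifiable(50_000, 0.05, 2_000_000))  # False: $50k < $100k expected loss
```

Real risk assessments involve far more than a single multiplication, of course; the point is that the weighing is explicit and documented, rather than left to each team’s intuition.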

We must be mindful of millennials. The next generation of workers brings unprecedented tech savvy. But they often lapse into insider behaviors that could trigger major consequences: More than two out of five use the same passwords for multiple systems and apps, according to a 2016 survey we conducted, and only one-third use secure passwords for all of their accounts (compared to 53 percent of baby boomers who do).

As stated, continuous training and awareness efforts—along with the fostering of an information assurance culture—will go a long way in helping millennials and subsequent generations recognize the positive outcomes of cautious, preventative measures.

Ultimately, we must acknowledge that technology doesn’t run technology—humans do. In the modern world of transformation and business innovation, the designation of special rock-star roles for groups such as 18F is probably necessary for relevancy (if not survival). But as talented as 18F members are, they are as subject to human failings as any of us. No one should be declared immune from the policies and practices that protect our data and systems; everyone needs a structure of standards, rules, awareness and training to stay on course. Then, managers have to monitor user activities to ensure those standards are upheld.

With IT and business collaborating on strategies that build effective risk management into the rules, agencies won’t confine these teams to a state of inactivity. Instead, they’ll move forward to accomplish great things while earning a liberating sense of trust within their organizations, which will only produce more innovation. That’s what every agency should strive for, and fortunately, it’s well within reach.

Editor's note: This article was updated April 7 to clarify that comments are the author’s opinion, not direct quotes.
