For the 17 agencies that make up the intelligence community, keeping secrets is a priority. But intelligence leaders have learned that if they share information — not hoard it — they can fulfill their missions more effectively.
In the past five years, the IC has been using technology to break down barriers between agencies and to make intelligence data a community asset, says Jennifer Kron, the acting tech chief for the intelligence community.
The IC is shifting away from a “castle-and-moat approach” (in which users within an organization can see only certain types of data) toward a model where anyone in the intelligence community with the appropriate security clearances can view that information, Kron says.
Data sharing is key to efficiency at some of the government’s largest agencies. By experimenting with a range of technologies, IT leaders from across government hope to minimize overlap and create new capabilities and backstops. These efforts include improving cloud security to allow for the sharing of sensitive data, managing identification and authentication services to stretch across multiple agencies, and maximizing interoperability among government systems.
But getting there hasn’t been easy. For decades, each intelligence agency ran its own network with its own enterprise IT systems, but information sharing was the exception, not the rule. When agencies did share data, it was done via endpoint devices that lacked the highest levels of security. Not only were these agencies not living up to the data-sharing goals of the Intelligence Reform and Terrorism Prevention Act of 2004, Kron says, but also the entire approach was wildly inefficient.
“We were buying the same capabilities, sometimes from the same vendor, 17 different times, for systems that didn’t talk to each other,” she says.
Enable Information Sharing in the IC
About five years ago, the IC moved away from siloed IT and established the IC Information Technology Enterprise. ICITE is a platform of nine shared services, from security to networking, email and virtual desktops, all delivered via a private cloud.
“Each part of the IT enterprise is provided by one or two agencies that do the thing they are best at, which they then make available to the rest of the community,” Kron says. She formally serves as the acting assistant director of national intelligence and CIO for the IC. “By evolving into this enterprise approach, we make it a lot easier to share information, enhance integration, improve our security and become more efficient.”
But moving from legacy infrastructure to the new cloud environment will take years, Kron says. No staffers are required to use ICITE, though Kron notes IC employees may be unknowingly using aspects of it, such as its identification and authentication services. Instead, the IC is migrating its legacy systems during normal refresh cycles. Leaders hope the benefits entice intelligence staffers to adopt these services voluntarily.
For now, the most significant barriers to information sharing are cultural, not technological, she says. Users may believe that sensitive information isn’t secure in the cloud or that legal or policy issues prevent them from sharing it.
“The IT is not the hardest part of this,” she says. “Some people start out thinking sensitive information can’t go into the cloud. But it can. We’re able to execute fine-grained access control, so that only the right people can see the right data, no matter how it’s classified. When we ask what policy is stopping them from sharing, it turns out to be ‘that’s not how we’ve been doing things.’ That’s not policy, that’s culture.”
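Fine-grained access control of the sort Kron describes is commonly implemented as attribute-based access control (ABAC), where each request is checked against both the user's clearances and the data's classification labels. The sketch below is purely illustrative — the labels, levels and rules are hypothetical examples, not the IC's actual policy:

```python
# Illustrative attribute-based access control (ABAC) check.
# Clearance levels and compartments here are hypothetical, not real IC policy.
from dataclasses import dataclass


@dataclass(frozen=True)
class User:
    name: str
    clearance: int                          # numeric clearance level held
    compartments: frozenset = frozenset()   # need-to-know compartments held


@dataclass(frozen=True)
class Document:
    title: str
    classification: int                     # minimum clearance required
    compartments: frozenset = frozenset()   # compartments tagged on the data


def can_read(user: User, doc: Document) -> bool:
    """Grant access only if the user's clearance is sufficient AND the user
    holds every compartment the document is tagged with."""
    return (user.clearance >= doc.classification
            and doc.compartments <= user.compartments)


analyst = User("analyst", clearance=3, compartments=frozenset({"SIGINT"}))
report = Document("daily brief", classification=2,
                  compartments=frozenset({"SIGINT"}))
print(can_read(analyst, report))   # True: clearance suffices, compartments match
```

The point of the pattern is that the policy travels with the data: access is decided per request from the data's own labels, rather than from which network or agency silo the data happens to sit in.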
Legacy Systems Can't Talk to Each Other
Kron says these barriers are disintegrating, but it’s a long process of gaining users’ confidence. “It’s about trust,” she says. “Agencies need to become accustomed to trusting each other the way they trust themselves. That’s what we really have to work on.”
Federal officials looking to share data face several obstacles, says Lawrence Pingree, a vice president with Gartner. But legacy systems that lack the ability to communicate with each other may be the feds’ biggest barrier.
“Agencies have huge amounts of infrastructure, and it’s not like they all buy one piece of software that everyone uses,” he says. “Even when they use similar systems, there’s a fair amount of customization for each agency’s mission. And they’re all competing with tech companies for the talent that can make these systems communicate.”
Another obstacle agency leaders can face is that the standards created for exchanging data are often not universal.
“Interoperability is still a big problem,” he says. “If we want a more efficient government that spends less money, we need to standardize more and share information. Otherwise, agencies have to duplicate efforts, and that’s not efficient.”
Move Beyond Intelligence Data Silos
The National Technical Information Service is attempting to bridge that gap. NTIS provides data service solutions to agencies via joint venture partnerships with more than two dozen companies.
Among its other duties, NTIS manages the Social Security Administration’s Death Master File. That information is used by financial services firms and government agencies to prevent identity fraud, and it helps private-sector firms verify employee eligibility via the U.S. Citizenship and Immigration Services’ Verification Information System.
Data interoperability is a problem, agrees NTIS Director Avi Bender. He says agencies also need to improve how they deliver data to the public by deploying more application programming interfaces (APIs) that allow apps and websites to grab information from government databases. But Bender says what agencies need most is more private sector–style innovation — the ability to think beyond a siloed approach and move quickly to implement new solutions.
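The API pattern Bender describes typically means exposing database records over REST endpoints that return JSON. A minimal sketch of the consuming side is below; the endpoint URL, dataset name and response fields are hypothetical stand-ins, not any real agency's API, and the response is canned so the example runs offline:

```python
# Sketch of consuming an open-data REST API (hypothetical endpoint and fields).
import json
from urllib.parse import urlencode


def build_query(base_url: str, dataset: str, **params) -> str:
    """Compose a query URL in the style many open-data APIs expect."""
    return f"{base_url}/{dataset}?{urlencode(params)}"


def parse_records(payload: str) -> list:
    """Extract the list of records from a JSON API response body."""
    return json.loads(payload).get("results", [])


url = build_query("https://api.example.gov/v1", "facilities", state="VA", limit=2)
# In practice you would fetch `url` with urllib.request or a similar client;
# here we parse a canned response instead so the sketch has no network dependency.
canned = '{"results": [{"id": 1, "name": "Plant A"}, {"id": 2, "name": "Plant B"}]}'
records = parse_records(canned)
print(url)
print([r["name"] for r in records])
```

Publishing data behind an interface like this, instead of as ad hoc file exports, is what lets apps and websites pull current records directly from the source system.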
“We need new business models for innovation that promote engagement between the private sector and the federal government around pressing national data issues,” he says. “This is what NTIS provides.”
Agencies Complement Their Capabilities via Data Sharing
Data sharing between disparate agencies, especially when one department’s mission complements another’s, can further the government’s effectiveness.
Consider the Environmental Protection Agency, which has multiple data-sharing agreements in place with other entities, including the Defense, Energy and Transportation departments.
In March 2015, the EPA’s Office of Chemical Safety and Pollution Prevention established a data-sharing agreement with the Food and Drug Administration’s Office of Foods and Veterinary Medicine. The goal is to share information where both agencies have jurisdiction over substances incorporated in food, drugs and cosmetics.
For example, a maker of an anti-microbial food wash must demonstrate to the FDA that its product won’t affect the safety of the food being washed; that same manufacturer must prove to the EPA that this chemical won’t harm the environment. Sharing information allows both agencies to accomplish their missions more efficiently, says Marianna Naum, an FDA policy analyst.
“This has been most helpful in our Food Contact Substance Notification program, which has a very tight timeline and where swift data transfer is extremely useful,” she says. The agreement allows researchers to choose their own process for sharing data but establishes guidelines for handling confidential information as well as a foundation for further cooperation. For example, in some cases, sensitive information can be shared only via encrypted disk.
The IC’s Kron says increased cooperation is a trend she has seen across the intelligence community. “The level of collaboration right now among the CIOs of all the agencies goes beyond anything I have seen,” she says. “They really are thinking about doing what’s best for the community.”
This content is made possible by FedTech. The editorial staff of Nextgov was not involved in its preparation.