Soft Landings

Agencies concerned about sensitive data take incremental steps into the cloud.

Is it possible to walk in the clouds while keeping one foot on the ground? Yes, and sometimes it's the wiser course.

As federal agencies move ahead to fulfill President Obama's call for a broader embrace of Internet-based technology operations, some are reluctant to entrust their data and their systems to the cloud entirely. Perhaps some processes are best kept closer to home.

Federal Chief Information Officer Vivek Kundra says so. In a May 2010 report he described a Colorado state effort to implement a hybrid cloud combining a private cloud (an enterprise computing architecture protected by a firewall) for highly secure data; an internal network for archival storage; and a public cloud where vendors can offer Internet-based applications and services.
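Kundra's Colorado example amounts to a placement policy: data classifications mapped to deployment tiers. The Python sketch below is a hypothetical illustration of that idea only; the tier names and classification labels are invented for this example, not drawn from the report.

```python
# Hypothetical sketch of a hybrid-cloud placement policy; tier names
# and classification labels are invented, not taken from the report.

PLACEMENT_POLICY = {
    "highly_secure":  "private_cloud",     # firewalled enterprise architecture
    "archival":       "internal_network",  # long-term storage kept in house
    "public_service": "public_cloud",      # vendor-hosted Internet applications
}

def place_workload(classification: str) -> str:
    """Return the deployment tier for a data classification,
    failing closed to the most restrictive tier when unsure."""
    return PLACEMENT_POLICY.get(classification, "private_cloud")

print(place_workload("public_service"))  # public_cloud
print(place_workload("unlabeled"))       # private_cloud (fail closed)
```

Defaulting unknown classifications to the most restrictive tier mirrors the cautious posture agencies take with sensitive data.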

Kundra paints a picture of a system in which disparate elements find their homes in and out of the public cloud.

Leading cloud vendors acknowledge this blended scenario can be desirable. Google, for example, for all its cloud accomplishments, still can't replace an agency's finance systems. Given such limitations, "there will be no agency that has a total cloud environment across their entire IT infrastructure," says David Mihalchik, Google Federal business development manager.

Public vs. Private

Why take to the cloud incrementally? At the Federal Communications Commission, it has been largely a matter of prudence. Small steps mean small mistakes, if any. "Otherwise, if you just move it all from here to there, you are just going to move the problem from here to there," says FCC Managing Director Steven VanRoekel.

FCC is deep into reengineering its website, not simply a surface remake but an architectural overhaul. As part of that effort, the agency has contracted with managed services provider Terremark to take all public-facing elements into the cloud. Many agree that Web pages, videos, photos and downloadable content can be managed efficiently and cost-effectively in the cloud.

Transactional data such as licensing and public comments will stay in house, partly because any change to these processes would require a commission vote, but also because FCC wants to move with caution into the new technology.

"The big thing here is getting the architecture right, to future-proof it," VanRoekel says. "Being able to move in this incremental way means we can take our time really getting the architecture right. We can set ourselves up for long-term success."

Moving public-facing elements to the cloud first makes sense to Mark White, chief technology officer for the U.S. federal practice at Deloitte Consulting. According to White, front-end applications can stand alone, whereas deeply integrated functions likely will have to remain housed together. "Integrated financials, manufacturing, inventory, logistics and supply chain will continue to have a strong business case for on-premise, in-house operation and integration," he says.
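White's rule of thumb, that loosely coupled front ends are cloud candidates while tightly integrated back-office systems stay home, can be read as a simple dependency test. The sketch below is purely illustrative; the system names, integration counts and cutoff are hypothetical, not from Deloitte.

```python
# Illustrative only: a crude coupling test in the spirit of White's
# rule of thumb. System names, counts and the cutoff are hypothetical.

TIGHT_INTEGRATIONS = {
    "public_website": 0,   # stands alone: a cloud candidate
    "licensing":      4,   # shares data with several internal systems
    "financials":     6,   # deeply integrated back office
}

COUPLING_CUTOFF = 2  # hypothetical threshold

def suggested_placement(system: str) -> str:
    """Loosely coupled systems go public; tightly coupled stay home."""
    return ("public_cloud"
            if TIGHT_INTEGRATIONS[system] <= COUPLING_CUTOFF
            else "on_premise")

for system in TIGHT_INTEGRATIONS:
    print(system, "->", suggested_placement(system))
```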

While architectural concerns help drive decisions about cloud implementation, two other elements weigh just as heavily: cost and security. Cloud deployments often save money, but not always. And cloud providers promise security, but by whose standard?

As Census Bureau CIO Brian McGrath ponders security in the cloud, he is treading with care, opting for a private cloud solution to manage sensitive census data, as well as highly confidential information that will flow from the 2012 economic census.

At the same time he sees financial benefits in the public cloud. In the most recent census it made fiscal sense for the bureau to shift customer relationship management applications, email blasts and blogging to the public cloud, relying primarily on Akamai Technologies, a Web content distribution provider. The speed, scalability and economies of scale of the public cloud pay off in those areas.

McGrath describes the economics of public versus private cloud not only in terms of cost, but also the overall maturity of the technology. Consider this: Any government agency that is first to sign on with a cloud vendor must ensure the vendor is certified as meeting National Institute of Standards and Technology security standards. That's a lengthy and expensive process, and McGrath does not want to carry the ball. Rather, he says, it often makes the most sense to keep some elements private until a public cloud vendor that already has crossed the NIST threshold emerges.

Sometimes, too, an incremental solution can come when an agency parcels out one element of a system to a cloud vendor while handing off related functions to another that is more closely tied to the agency's internal systems.

When the Transportation Department launched the Cash for Clunkers program, for example, it immediately was saddled with vast amounts of data and transactions.

The department had to integrate its systems with those of the thousands of car dealerships offering rebates to consumers who traded in old cars for more fuel-efficient models. Terremark offered the needed scale and speed, but Transportation determined that monitoring and security should be verified by another vendor, one outside the cloud provider's own infrastructure.

The department crafted a two-part solution. While Terremark crunched the big data via the cloud, vendor Layer 7 Technologies installed government-furnished equipment at the Terremark site, within Transportation's architecture. In this way, department staff and systems outside the cloud were able to monitor and verify the security of activities inside the cloud.
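The article does not detail how that verification worked. A common pattern for this kind of out-of-band assurance is to have trusted equipment inside the provider's facility sign audit records with a key the provider never holds, so staff outside the cloud can detect tampering. The Python sketch below illustrates that generic pattern only; it is not Transportation's or Layer 7's actual implementation, and the key and record format are invented.

```python
# Generic sketch of out-of-band audit verification: an agency-side
# monitor checks that audit records emitted inside the cloud carry a
# valid HMAC computed with a key the cloud provider never holds.
# Illustrative pattern only, NOT the actual Layer 7 implementation;
# the key and record format here are invented.

import hmac
import hashlib

AGENCY_KEY = b"example-key-held-on-government-furnished-equipment"

def sign_record(record: bytes) -> str:
    """Runs on the trusted in-cloud appliance: tag each audit record."""
    return hmac.new(AGENCY_KEY, record, hashlib.sha256).hexdigest()

def verify_record(record: bytes, tag: str) -> bool:
    """Runs outside the cloud: confirm the record came from the
    trusted appliance and was not altered in transit."""
    expected = hmac.new(AGENCY_KEY, record, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

record = b'{"dealer_id": 1234, "rebate": 4500}'
tag = sign_record(record)
assert verify_record(record, tag)                                # intact
assert not verify_record(b'{"dealer_id": 1234, "rebate": 9999}', tag)  # tampered
```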

Hybrid Hurdles

There are any number of reasons for making a gradual ascent into the cloud. It's a way to cut the cost of routine functions while keeping mission-critical systems in house. Hybrid systems also add a level of security on top of what cloud providers offer. And these solutions provide a means to keep integrated systems tightly bound while spinning off functions that can operate independently.

At the same time, a hybrid system presents one significant hurdle: running parallel systems can double operating cost or complexity, potentially erasing the cloud's financial and efficiency gains.

At Census, McGrath has some concerns about this, but he is confident vendors will come through with products that allow users to transition smoothly between their private and public cloud structures. "The tools are in the early stages, but given the push for cloud, I think those tools can and will mature very rapidly," he says.

That's the promise Mihalchik at Google Federal is making, saying interoperability is at the top of the company's cloud to-do list. "We think it is very important, and our engineering team is focused on making sure our cloud platform has those capabilities," he says.
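In practice, the interoperability tooling McGrath and Mihalchik describe often takes the form of an abstraction layer: applications code against a neutral interface, and operators decide per dataset whether the backing store is private or public. A minimal, hypothetical Python sketch of that idea, with in-memory stand-ins where real tools would wrap provider SDKs:

```python
# Minimal, hypothetical sketch of a private/public abstraction layer,
# the kind of transition tooling the article anticipates. Backends
# here are in-memory stand-ins; real ones would wrap provider SDKs.

from abc import ABC, abstractmethod

class StorageBackend(ABC):
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class PrivateCloudBackend(StorageBackend):
    """Stand-in for agency storage behind the firewall."""
    def __init__(self) -> None:
        self._store: dict[str, bytes] = {}
    def put(self, key: str, data: bytes) -> None:
        self._store[key] = data
    def get(self, key: str) -> bytes:
        return self._store[key]

class PublicCloudBackend(PrivateCloudBackend):
    """Stand-in for a commercial provider; same interface, so moving
    a dataset between clouds is a routing change, not a rewrite."""

PRIVATE, PUBLIC = PrivateCloudBackend(), PublicCloudBackend()

def backend_for(classification: str) -> StorageBackend:
    # Reclassifying a dataset changes where it lives, not the app code.
    return PRIVATE if classification == "sensitive" else PUBLIC

backend_for("sensitive").put("census/records", b"sensitive bytes")
backend_for("public").put("press/blog-post", b"public bytes")
print(backend_for("public").get("press/blog-post"))
```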

As cloud technologies mature, the ability to bridge legacy systems will be a crucial factor in allowing IT managers to build dynamic, cost-effective hybrids. That way agencies can leverage the speed, savings and scalability of the cloud while still keeping vital processes safe within their walls.

Adam Stone, a freelance journalist based in Annapolis, Md., writes about federal management issues, technology and a broad range of business topics.
