White House mulls making NASA a center for federal cloud computing

NASA and the Obama administration's top technology officer are considering using a NASA cloud computing prototype to test the president's plan for agencies to outsource information technology services to a shared platform.

President Obama has called for slashing federal IT infrastructure costs by relying on cloud computing, a model in which agencies pay for Internet-based access to shared hardware and software housed in off-site data centers.

One of those sites could be NASA. Officials at the space agency and the Office of Management and Budget have "broached the idea of NASA becoming an IT service provider," said Mike Hecker, NASA's associate chief information officer for architecture and infrastructure. But, "NASA as an IT service provider takes us into a new realm. We're still debating if that's a good idea or not."

NASA is developing a cloud computing model, called Nebula, to support some of its projects. For example, the agency uses Nebula to share NASA images and statistics with international partners and academic institutions. The system provides high-capacity computing, storage and network connectivity.

Federal CIO Vivek Kundra, Obama's top technology executive, is examining many alternatives for innovation in the cloud, including using Nebula as a centralized platform to service multiple agencies, OMB officials said. Chris Kemp, CIO at NASA's Ames Research Center, who is spearheading the program, is working with the federal government's cloud working group, officials added.

NASA has not committed to developing Nebula into a governmentwide cloud platform, Hecker emphasized. The agency's mission is to explore space and science for the benefit of the public, with IT serving as a tool to fulfill that mission, officials stressed.

"Providing IT services to others is not something the agency has historically done," he said. "However, cloud computing does provide some interesting opportunities and it is being discussed internally."

But given Ames' Silicon Valley location, the center's prowess in supercomputing and Nebula's compliance with federal IT standards, the platform could be a good fit for governmentwide cloud computing, analysts said.

Nebula "could be used in a way to run some new applications on it, to see whether it would make sense," said Ben Pring, a research vice president at Gartner Research who studies cloud computing. "Absolutely it would be a good, smart thing to do that."

Obama's fiscal 2010 budget proposal calls for optimizing cloud computing by "scaling pilots to full capabilities and providing financial support to accelerate migration." The fiscal plan acknowledges the effort will involve upfront costs, but projects those expenses will be more than offset by savings from consolidating data centers.

But the White House's drive to move the federal IT infrastructure to a cloud computing model will entail an estimated 10-year migration, because several hurdles must first be cleared: inadequate data protection, upfront costs, and the change-averse culture of the federal bureaucracy, Pring said.

"The whole notion of cloud models is obviously being pushed from the top down, and I think that's to be applauded," he said. "Having said that, I think there's a significant journey ahead. Cloud isn't going to replace all preceding models anytime soon."

NASA is at the beginning of that journey. At present, the agency's IT systems are not centralized: it operates about 70 data centers with varying levels of efficiency and availability, which NASA is trying to consolidate into two outsourced data centers.

The primary goal of concentrating IT operations is to improve IT security, Hecker said. Collaboration among employees and cost-cutting also are important objectives, he added.
