
National lab, IBM team up on supercomputer initiative


IBM's Dawn supercomputer is a cousin to the forthcoming Vulcan computer. (Photo: National Nuclear Security Administration)

IBM and the Lawrence Livermore National Laboratory are working together on a project to use supercomputing to help industry spot trends sooner and develop new technologies more quickly, the pair announced Wednesday.

The new supercomputer, called Vulcan, will use some of the same ultrafast technology underlying Lawrence Livermore’s Sequoia, recently named the world’s fastest supercomputer. Sequoia monitors, evaluates and tests the nation’s nuclear stockpile.

Vulcan, which is due in 2012, will focus instead on crunching mostly unclassified data to speed the development of new technologies in applied energy, green energy, manufacturing, data management and other fields, IBM said.

Sen. Dianne Feinstein, D-Calif., announced the Livermore deal during an IBM event at the Capitol on Wednesday.

Feinstein, who leads the Senate Intelligence Committee, praised Sequoia for “maintaining the safety, security and reliability of nuclear weapons stockpiles without underground testing.”

“This machine will be an important tool to extend the life of aging weapons and anticipate future problems resulting from aging, leading to smaller nuclear weapons stockpiles,” she said.

The United States currently maintains a large stockpile of nuclear weapons so it can have spares in case some weapons don’t function, a Feinstein spokesman said. If the government had greater confidence that weapons in the stockpile were operating correctly, fewer spares would be needed, he said.

Vulcan could apply data-crunching power similar to Sequoia’s to help U.S. businesses be more competitive in emerging fields, Feinstein said.

The IBM event focused largely on how advances in supercomputing and big data analysis can benefit government and the private sector.

Big data typically refers to large amounts of unstructured information -- anything from scanned books to satellite feeds to social media posts to data outputs from the Large Hadron Collider at CERN -- that was once too complex for existing computers to sift through to extract meaningful patterns.

The Large Hadron Collider, for instance, produces 40 terabytes of data per second. That’s nearly three times the data comprising all the books in the Library of Congress.
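The comparison can be checked with quick arithmetic. This is a sketch only: the article gives the 40-terabytes-per-second figure, but the size of the Library of Congress's digitized print collection is an assumption here -- roughly 15 terabytes is a commonly cited estimate, not a number from the article.

```python
# Sanity check of the "nearly three times" comparison in the text.
LHC_TB_PER_SEC = 40   # stated in the article
LOC_PRINT_TB = 15     # assumed estimate of the Library of Congress print collection

ratio = LHC_TB_PER_SEC / LOC_PRINT_TB
print(f"{ratio:.1f}x")  # prints 2.7x -- consistent with "nearly three times"
```

Under that assumed estimate, one second of LHC output is about 2.7 times the print collection, which matches the article's characterization.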

The White House invested $200 million in March in research and development initiatives related to the mining, processing, storage and use of big data.

The best thing the government can do to support the development of big data technology is to set standards for how data should be organized and invest in basic research in computer science and mathematics to tease patterns out of massive data troves, Steven Ashby, deputy director for science and technology at the Pacific Northwest National Laboratory, said during the IBM event.

The government also can invest in proofs of concept in particular fields such as financial analysis to show the value of big data analysis to the private sector, Ashby said.

David McQueeney, vice president for software in IBM’s research division, compared big data to a natural resource like coal or oil -- there’s a lot of it out there, but it’s difficult to extract and must be refined before it’s useful.

“It’s underexploited because we haven’t had computer power at the right price point to do the rather complicated extraction of unstructured data to provide real insights,” he said. “It’s been underappreciated because the size of the systems that can hold and manipulate the data haven’t been present.”
