Stop Loose Nukes With Big Data and Crowdsourcing, Experts Urge


The United States should tap big data analytic techniques and crowdsource commercial imagery to track the proliferation of nuclear weapons, the Defense Science Board recommended in a report released last week.

The board’s “Task Force Report on Assessment of Nuclear Monitoring and Verification Technologies” said the Departments of Defense, Energy and State, along with intelligence agencies, need new technology and systems to track the nuclear activities of nations that have acknowledged developing nuclear weapons and of “threshold” states that have small or nascent programs.

The task force was led by Miriam John, a consultant and vice president of the Energy Department’s Sandia National Laboratory in Albuquerque, N.M., and Donald Kerr, the principal deputy director of National Intelligence from 2007 through 2009. They recommended the United States develop a national strategy “to identify proliferants early or well before the fact” and undertake “a major diplomatic effort to build trust and confidence.”

This will require monitoring technologies beyond radiation and seismic sensors to verify compliance by nuclear treaty nations. For example, the report suggested taking tactical intelligence, surveillance and reconnaissance technologies developed to counter roadside bombs in Iraq and Afghanistan and adapting them for nuclear monitoring, especially where access is limited or nonexistent.

In addition, technologies for extracting vast quantities of data, “being developed commercially in the information technology industry, and for other purposes in DoD and the [Intelligence Community], need to be extended and applied to nuclear monitoring,” the report said, in an elliptical reference to the National Security Agency’s global data mining and analysis efforts first exposed by former NSA contractor Edward Snowden in June 2013.

“Data gathered from the cyber domain establishes a rich and exploitable source for determining activities of individuals, groups and organizations needed to participate in either the procurement or development of a nuclear device,” the report said. Advances in data exfiltration hold promise as well, it said.

The report noted limitations to cloud-based big data analytics, however: “The transmission latency among storage devices in cloud based architecture are overwhelming the processing time for computational operations. As such, the analytics need to stay near the data; i.e., in a “back to the future” sense, large banks of co‐located processors will form the big data processing architecture of the near future.”

Crowdsourcing of commercial satellite imagery offers the chance to cut costs and widen global coverage of nuclear sites, the report said, with a key caveat: the quality of both the data and the analysis. “There have already been major analytical errors made by untrained imagery analysts who have published openly,” the authors noted.

The task force also recommended the United States:

  • Expand the use of open source and commercial information.
  • Continue and expand the augmentation of data from national intelligence collection systems with imagery and radar from commercial systems.
  • Conduct systems studies and engage operators early in development to more quickly implement advanced radiation detection systems.

To drive these points home, the Task Force said it adopted the motto, “We can’t let our treaties get ahead of our monitoring and verification headlights.”
