Stop Loose Nukes With Big Data and Crowdsourcing, Experts Urge


Defense Science Board says new technologies and a new strategy are needed to counter nuclear proliferation.

The United States should tap big data analytic techniques and crowdsource commercial imagery to track the proliferation of nuclear weapons, the Defense Science Board recommended in a report released last week.

The board’s “Task Force Report on Assessment of Nuclear Monitoring and Verification Technologies” said the Departments of Defense, Energy and State along with intelligence agencies need new technology and systems to track the nuclear activities of nations that have acknowledged developing nuclear weapons and “threshold” states that have small or nascent programs.

The task force was led by Miriam John, a consultant and former vice president of the Energy Department’s Sandia National Laboratories in Albuquerque, N.M., and Donald Kerr, principal deputy director of National Intelligence from 2007 through 2009. They recommended the United States develop a national strategy “to identify proliferants early or well before the fact” and undertake “a major diplomatic effort to build trust and confidence.”

This will require technologies beyond radiation and seismic monitors to verify compliance by nuclear treaty nations. For example, the report suggested adapting tactical intelligence, surveillance and reconnaissance technologies, developed for use in Iraq and Afghanistan to counter roadside bombs, for nuclear monitoring, especially where access is limited or nonexistent.

In addition, technologies for extracting vast quantities of data, “being developed commercially in the information technology industry, and for other purposes in DoD and the [Intelligence Community], need to be extended and applied to nuclear monitoring,” the report said, an elliptical reference to the National Security Agency’s global data mining and analysis efforts first exposed by former NSA contractor Edward Snowden in June 2013.

“Data gathered from the cyber domain establishes a rich and exploitable source for determining activities of individuals, groups and organizations needed to participate in either the procurement or development of a nuclear device,” the report said. Advances in data exfiltration hold promise as well, it said.

The report noted limitations to cloud-based big data analytics, however: “The transmission latency among storage devices in cloud based architecture are overwhelming the processing time for computational operations. As such, the analytics need to stay near the data; i.e., in a ‘back to the future’ sense, large banks of co-located processors will form the big data processing architecture of the near future.”

Crowdsourcing of commercial satellite imagery offers the chance to cut costs and widen global coverage of nuclear sites, the report said, with a key caveat: the quality of both the data and the analysis. “There have already been major analytical errors made by untrained imagery analysts who have published openly,” the authors noted.

The task force also recommended the United States:

  • Expand the use of open source and commercial information.
  • Continue and expand the augmentation of data from national intelligence collection systems with imagery and radar from commercial systems.
  • Conduct systems studies and engage operators early in development to more quickly implement advanced radiation detection systems.

To drive these points home, the task force said it adopted the motto, “We can’t let our treaties get ahead of our monitoring and verification headlights.”

(Image via Nevada31/Shutterstock.com)