The Defense Innovation Board’s 10 Essential Rules for Developing Software

The “commandments” adapt industry best practices for the Pentagon, but every agency could learn from them.

The Ten Commandments Moses brought down from Mt. Sinai may have seemed intuitive, but people still had trouble following them.

Similarly, the Defense Department will likely be fighting an uphill battle to meet new guidelines decreed by one of its top tech advisory groups.

The Defense Innovation Board on Thursday released a list of recommendations for how the Pentagon can make its IT acquisition process more effective, dubbed the Ten Commandments of Software. The board, composed of high-profile academic and industry technology experts, built the commandments to reflect best practices from the private sector and give the department a framework for rapidly developing and deploying software.

“The military does software in a way that’s not obvious to me,” said DIB Chairman Eric Schmidt, who previously served as executive chairman of Google’s parent company Alphabet, at the board’s public meeting. “Our strategy is to do things that are actionable … and see what the DOD likes or doesn’t like about our recommendations.”

Despite accounting for nearly half of all federal tech spending, the department houses two of the government’s oldest IT systems and must manage an often convoluted acquisition process while keeping pace with the latest cyber and national security threats.

If the Pentagon can’t build a process for rolling out new software faster than the country’s enemies, “frankly, we’re screwed,” California Institute of Technology professor Richard Murray said.

While DIB created the list to help the Pentagon gain an edge on adversaries around the world, the underlying principles could help streamline software modernization at agencies across government.

The commandments are as follows:

  1. Make computing, storage, and bandwidth abundant to Defense Department developers and users.
  2. All software procurement programs should start small, be iterative, and build on success—or be terminated quickly.
  3. Budgets should be constructed to support the full, iterative life cycle of the software being procured with an amount proportional to the criticality and utility of the software.
  4. Adopt a DevOps culture for software systems.
  5. Automate testing of software to enable critical updates to be deployed in days to weeks, not months or years.
  6. Every purpose-built Defense software system should include source code as a deliverable.
  7. Every Defense system that includes software should have a local team of Defense software experts who are capable of modifying or extending the software through source code or API access.
  8. Only run operating systems that are receiving (and using) regular security updates for newly discovered security vulnerabilities.
  9. Data should always be encrypted unless it is part of an active computation.
  10. All data generated by Defense systems—in development and deployment—should be stored, mined, and made available for machine learning.
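Commandment 5—automated testing that lets critical updates ship in days rather than months—can be illustrated with a minimal sketch. The module and function names below are hypothetical examples invented for illustration, not part of any actual Defense Department system; the point is simply that every change passes through machine-run checks before deployment.

```python
# A minimal sketch of commandment 5: automated tests that gate every
# software update so fixes can ship quickly. The names here
# (apply_update, UpdateTests) are hypothetical illustrations.
import unittest


def apply_update(version: str, current: str) -> str:
    """Accept an update only if it moves the version forward."""
    def parse(v: str):
        return tuple(int(x) for x in v.split("."))
    if parse(version) <= parse(current):
        raise ValueError("update must increase the version")
    return version


class UpdateTests(unittest.TestCase):
    def test_accepts_newer_version(self):
        self.assertEqual(apply_update("1.2.0", "1.1.9"), "1.2.0")

    def test_rejects_rollback(self):
        with self.assertRaises(ValueError):
            apply_update("1.0.0", "1.1.0")


if __name__ == "__main__":
    # In a continuous-integration pipeline this suite would run on
    # every commit; a failure blocks the deployment automatically.
    unittest.main(exit=False)
```

In practice a suite like this runs automatically on every commit, which is what makes the days-not-months deployment cadence in the commandment possible.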

Murray, who spearheads the board’s science and technology operations, made clear the commandments are by no means exhaustive, and members “anticipate updating these as we go forward and learn more.”

And that updating got underway almost immediately after the guidelines were presented.

Calling attention to the fact that the Pentagon has no classification code for programmers or data scientists, Schmidt underscored the need to have people with the technical expertise to actually manage software and put the commandments into action.

He suggested adding an eleventh commandment to the list: “Thou shalt count the number of programmers you actually have.”

The board has previously recommended a handful of culture changes aimed at making government service more attractive to talented technologists.

Board member Milo Medin, a vice president at Google Capital, also suggested the group define a clear set of metrics to tell whether a program is on track at different stages in the software development process.