Why the US Government Should Learn to Fail Fast Like Silicon Valley

The failure mantras of Silicon Valley—“fail fast,” “fail often,” “fail forward”—have spawned a movement to view mistakes as opportunities to learn and change. More and more entrepreneurs, executives, and even customers accept that the path to innovation may involve some failures along the way.

But this concept hasn’t yet permeated the public sector. Take, for instance, the backlash this autumn against the Centers for Disease Control and Prevention over its handling of the Ebola virus. On Oct. 22, the CDC issued recommendations for monitoring and moving people with potential Ebola exposure; five days later, it released new instructions.

The shift in guidance unleashed a firestorm of criticism. Fox News took issue with the health agency’s contradictions, while the Guardian called out the CDC for its “chaotic response” to the crisis. According to a CBS News Poll conducted amidst the change, 57% of Americans said the CDC did a fair or poor job in its handling of Ebola.

Bear in mind that the current outbreak of Ebola, for which there is no known cure, is unprecedented. Managing the situation as events unfold is no doubt extremely difficult. During a press call, Dr. Tom Frieden, the director of the CDC, said his organization bases its judgments “on what we know and what we learn.” As the science and experience change, “we adopt and adapt our guidelines and recommendations.”

So what accounts for the chorus of criticism? You could chalk it up to PR blunders: if only the agency had controlled its messaging, it could have avoided mass condemnation. It’s also possible the CDC’s new guidelines are overly restrictive, or indeed not restrictive enough, depending on whom you listen to, be it the media, the nurses’ union, or New Jersey Governor Chris Christie.

There’s sound logic behind some of the criticism: the CDC, the World Health Organization, and others should indeed have acted more quickly, many experts agree. But let’s examine an unspoken message revealed by the dialogue: we have deeply inconsistent attitudes towards learning from missteps.

It seems that we—the public—have difficulty accepting that tackling new and complex challenges may entail changes in approach along the way. There’s even less support for the idea that government officials may make mistakes in the quest to learn and adapt.

As citizens, don’t we want government officials who are thoughtful and responsive, not impetuous and over-reactive? If we indeed think that policymakers should make decisions based on evidence and what they learn along the way, we must understand that they may reverse their advice, sometimes quickly if necessary.

The timeframe for making such adjustments may not match what we expect, but allowing for the tempo dictated by discovery is part of the shift in attitude needed to support smarter policies. And we need to allow for occasional miscalculations by our leaders, or even outright failure.

Could public and media responses to apparent governmental missteps adopt some of the forbearance that startups now enjoy? Without some nuance in how we make sense of small, instructive failures, we run the danger of punishing all learning from experience. Such pressure to avoid any missteps might actually result in bigger blunders down the line, a result of missing opportunities to learn from small failures.

Public-sector leaders, meanwhile, fall short when it comes to explaining their own process for producing useful insights. Our leaders need to present their learning more openly, explaining how evidence-based decisions require experience, testing, and research. They need to carry out their work following proven methods that avoid failures due to poor preparation, while maximizing the value of instructive failures.

To start, savvy public sector leaders need to map the intended effects of their planned actions, then test crucial assumptions early on, keeping an eye on attendant risks. As they execute plans, their teams need to continually collect and examine data—and draw conclusions from what they glean. At the end of every project, they need to specify what they have learned and ensure the lessons are shared not just within a given agency or office, but more widely across the public sector.

Of course there are differences between Silicon Valley and the public sector, just as there are differences between creating a new shopping app and devising a plan to fight a deadly virus. But we’ve come to accept that startups tweak their offerings in response to discoveries, that military leaders change tactics as battlefield conditions shift, and that doctors update their treatment plans as new test results come in. It’s time for us all to embrace real-world learning in the public sector, too.

(Image via Oleksiy Mark/Shutterstock.com)