
Is 16 Minutes the Best We Can Do for Tornado Warnings?

For those caught in the path of the EF4 tornado in Oklahoma last week, 16 minutes was all they had to prepare for the 200 mph winds that would flip cars, twist steel, and all but destroy a 1.3-mile-wide swath of the town of Moore.

Sixty or so years ago, they would have been lucky to have any warning at all. A predecessor of the National Weather Service has issued tornado warnings only since 1938. Before that, the word "tornado" had been banned by the Army Signal Corps, which once monitored the weather. To the corps, tornadoes were far too unpredictable to track, and the word incited more panic than meaningful response.

The House Environment Subcommittee is drafting a bill to spur forecasting research and technology procurement in the National Weather Service, in part to extend lead times for tornado warnings. The National Weather Service had previously received $23.7 million for forecasting in the Sandy relief bill. The current bill would seek to spur investment in forecasting research (there's no specific dollar figure yet) and to replace satellite systems that will degrade over the next few years.

Previously, the Government Accountability Office had warned of a potential satellite data gap in 2016 or 2017 as aging satellites go unreplaced. Also propelling the bill is the fact that European forecasters are outpacing their American counterparts (European weather models predicted the path of Hurricane Sandy more accurately than American ones did). Barry Lee Myers, CEO of AccuWeather, put it in stark terms at a recent congressional hearing on the bill: "From a national-security standpoint, relying on other countries for better weather models places America in a weak and subservient position."

In many ways, and especially compared with the tracking of large, slow-moving storms such as hurricanes, we haven't progressed very far in issuing tornado warnings. The 16 minutes of warning in Moore, Okla., is close to the national average of 13 minutes. Meanwhile, the false-alarm rate for tornado warnings is near 75 percent, according to NOAA. And there's not much to do in the path of a tornado besides shelter in place.

"Because we can see hurricanes, because they are large, and because they move relatively slowly over land and sea areas, we can evacuate people," Myers testified. "With regard to tornadoes, we do the opposite, we expect people to ride out the storm in their bathtubs. That's unacceptable."

For the near term, or at least the next decade, those 15 or so minutes might be the best we can do. But NOAA's Lans Rothfuz explains that the next decade or so might see a complete overhaul in the way tornadoes are forecast. Right now, the only way to issue a warning for a tornado is to detect one forming; NOAA calls this "warn on detection." The ideal is "warn on forecast": a warning issued from a complete computer simulation of a storm system, before a tornado ever takes shape.

"If you're relying on just detecting, some thunderstorms last for about an hour and then they are gone," says Rothfuz, who works at the Storm Prediction Center. "And so, if you're relying on detection, the best you can hope for is maybe 15- to 30-minute lead times."

There are three obstacles to making this transition, Rothfuz says:

One: We need better science—more accurate and complex weather forecasting models that can recreate a storm on a computer. To simulate a storm is to predict its path with greater reliability.

Two: We need better, denser weather data. "In order to get the data that these models need, we have to take it down to finer and finer resolutions," he says. Currently, weather balloons monitor conditions every 100 miles (to extend the resolution metaphor, this makes for a pixelated map). Increasing the number of data points means deploying more advanced radars, such as the Phased Array Radar system, which collects data in about one-sixth the time of more common systems. Or using new technologies, such as GPS profiling, which analyzes ordinary GPS signals as they bend through the atmosphere. Or crowdsourcing: collecting data from passenger cars equipped with temperature and rain sensors.

Three: We need the sheer computer horsepower to run dense data through complex models.

These changes are in the works, but they may take a while to come to fruition. "We're still talking about 10 years out for that type of technology," Rothfuz says.
