Sandy shows storm-prediction progress

As bad as Sandy was, it could have been far worse. Advances in the government's prediction tools allowed early, accurate warning.

Hurricane Sandy bears down on the mid-Atlantic on Oct. 29 in this NASA satellite image.

By any account, the superstorm known as Hurricane Sandy was devastating, claiming 113 American lives and causing an estimated $50 billion in economic damage as it wreaked havoc on the East Coast, on its way to becoming the second-costliest U.S. storm behind only Hurricane Katrina.

But the damage could have been much worse – in loss of life and economic costs – were it not for the astoundingly accurate forecasts of Sandy’s path from the National Hurricane Center.

Resource

Download a slide presentation on Hurricane Weather Research and Forecasting by Dr. Sundararaman Gopalakrishnan.

Some of the Center’s model runs, five days out, pegged the hurricane’s landfall within 30 miles of the actual landfall point. That is more than 10 times better than the public and emergency personnel bracing for the storm could have hoped for 20 years ago, when the NHC’s historic average for 72-hour (three-day) hurricane forecasts was accurate only to about 345 miles. Had that still been the case here, the best forecast would have put landfall anywhere from northeastern Connecticut to northern South Carolina.
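For perspective, the improvement is simple arithmetic. A quick sketch, using only the figures cited above (the Python here is purely illustrative):

```python
# Track-forecast errors cited in this article; the arithmetic spelled out.
error_1990s_mi = 345   # historic 72-hour average miss, roughly 20 years ago
error_sandy_mi = 30    # best five-day landfall miss for Sandy

print(f"Improvement factor: {error_1990s_mi / error_sandy_mi:.1f}x")  # ~11.5x

# A +/-345-mile miss spans a 690-mile stretch of coastline, roughly the
# distance from northeastern Connecticut to northern South Carolina.
print(f"Uncertainty span, old-style forecast: {2 * error_1990s_mi} miles")
```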

In fact, if a storm like Sandy had struck more than two decades ago, there’s a good chance meteorologists wouldn’t have predicted landfall at all. Instead, they would have used historical data to conclude that such a system would be steered by a high-pressure ridge into a right turn, out into the eastern Atlantic Ocean and away from the coastline, according to Dr. Sundararaman Gopalakrishnan, a senior meteorologist at the National Oceanic and Atmospheric Administration's Atlantic Oceanographic and Meteorological Laboratory in Miami.

“If Hurricane Sandy happened 20 years back, it would almost certainly have been a disaster without much warning,” said Gopalakrishnan, who watched the predictive computer models shift Sandy’s projected trajectory west toward New Jersey as more and more data came in.

He recalls thinking, “Oh no. This is not good.”

“I’m feeling sad about the deaths caused by this hurricane, but I think if it was not for these kinds of forecasts, there would be much, much more,” Gopalakrishnan said. “I can clearly see that from forecasting this storm. This was a unique situation as Sandy was captured in hybrid circulation from the land. I wish it would have turned to the right (east), but it is better to know than not know.”

The technology behind the forecasts

When NOAA – a scientific agency housed within the Department of Commerce that consists of six agencies, including the National Weather Service – was formed in 1970, meteorologists had satellites and historical data to help predict the weather, but they didn’t have supercomputers like the IBM Bluefire, housed in the bottom levels of the National Center for Atmospheric Research in Boulder, Colo. It has a peak speed of 76 teraflops, or about 76 trillion floating-point operations per second.

The accuracy of predictions for the storm beat anything that could have been done as recently as 20 years ago. (Image: NOAA)

Massive amounts of data, however, aren’t worth much without modeling, and the NHC uses more than 30 different models to predict storm intensity. In 2007, the NHC began using the Hurricane Weather Research and Forecasting (HWRF) model, which analyzes big data collected from satellites, weather buoys and airborne observations from Gulfstream-IV jets and P-3 aircraft, and churns out high-resolution computer-modeled forecasts every six hours.
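That six-hour rhythm is the heart of the operation: ingest fresh observations, re-initialize the model, run it forward. Here is a minimal sketch of such a forecast cycle; the function names `assimilate_observations` and `run_hwrf` are hypothetical stand-ins for illustration, not NOAA software:

```python
from datetime import datetime, timedelta

CYCLE = timedelta(hours=6)  # HWRF issues a new forecast every six hours

def assimilate_observations(cycle_time):
    """Illustrative stand-in: fold satellite, buoy and aircraft data
    into a set of initial conditions valid at the cycle time."""
    return {"valid_at": cycle_time, "sources": ["satellite", "buoy", "aircraft"]}

def run_hwrf(initial_conditions, lead_days=5):
    """Illustrative stand-in: integrate the model forward from the
    initial conditions out to the given lead time."""
    return {"init": initial_conditions["valid_at"], "lead_days": lead_days}

# Illustrative cycle window: five days of runs leading up to landfall.
cycle_time = datetime(2012, 10, 24, 6)
while cycle_time <= datetime(2012, 10, 29, 18):
    forecast = run_hwrf(assimilate_observations(cycle_time))
    print(f"Cycle {cycle_time:%Y-%m-%d %H}Z: forecast out to "
          f"{forecast['lead_days']} days issued")
    cycle_time += CYCLE   # 23 cycles in all, matching the count below
```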

Over the past year, Gopalakrishnan said, hurricane forecasts have further improved through the use of higher-resolution computer models. The model’s horizontal grid spacing has been refined from 9 kilometers to 3 kilometers, making HWRF the “highest resolution hurricane model ever implemented for operations in the National Weather Service,” according to Richard Pasch, senior hurricane specialist at the NHC.
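Tripling the resolution is far more expensive than it sounds, because the grid-point count grows with the square of the refinement in the horizontal. A back-of-the-envelope sketch (the 900-kilometer domain is a made-up round number, not HWRF’s actual domain):

```python
# Effect of refining horizontal grid spacing from 9 km to 3 km over a
# fixed domain. The domain size is hypothetical, chosen for round numbers.
domain_km = 900

for spacing_km in (9, 3):
    points_per_axis = domain_km // spacing_km
    total_points = points_per_axis ** 2
    print(f"{spacing_km} km spacing: {points_per_axis} x {points_per_axis} = "
          f"{total_points:,} horizontal grid points")

# 3x finer spacing means 9x the horizontal grid points, and finer grids
# usually force shorter time steps too, so compute cost grows even faster.
```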

Forecasting a hurricane takes big data, high-resolution models that incorporate large-scale physics and an accurate representation of initial conditions, and HWRF’s simulations have been successful thus far, Gopalakrishnan said, helping to improve hurricane forecasts by up to 20 percent.

Beginning five days before Sandy’s landfall, the NHC’s HWRF model ran 23 simulations of the hurricane’s expected path – one every six hours, with each simulation taking about 85 minutes to complete.
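Those figures imply a tight but workable schedule. The timing, spelled out (numbers from the article):

```python
runs = 23                # simulations from five days out through landfall
minutes_per_run = 85     # approximate wall-clock time per simulation
cycle_minutes = 6 * 60   # a fresh run every six hours

print(f"Total model time: {runs * minutes_per_run / 60:.1f} hours")
print(f"Slack per cycle: {cycle_minutes - minutes_per_run} minutes for "
      "forecasters to interpret and distribute each run")
```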

Simulations and other such data are shared immediately with other federal agencies, including the Federal Emergency Management Agency, according to Erica Rule, spokesperson for NOAA.

“When there is an active storm threatening landfall, there is extremely close coordination,” Rule said.

Not everything about Sandy was predicted as accurately as its landfall. Critics have questioned why predicted wind speeds in Washington, D.C., were higher than what was experienced, and the storm’s predicted landfall time was off by a few hours.

Yet these errors in NHC forecasting seem small compared to yesteryear’s predictions.

Today, the NHC’s average miss in hurricane landfall predictions is a scant 100 miles. In 1970, NHC meteorologists missed by an average of 518 miles. Major improvements have been made in precipitation estimates and wind speed forecasts as well, which can prove vital in predicting storm surges and imminent flooding.

The increased predictive power shrinks what scientists call a hurricane’s “cone of uncertainty,” the area that includes all the possible paths a hurricane might take.
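One way to picture the cone: at each forecast lead time, draw a circle around the predicted position sized from historical track errors at that lead time; the circles swept along the track form the cone. A minimal sketch of that idea, using made-up placeholder radii rather than official NHC error statistics:

```python
import math

# Hypothetical track-error radii (miles) by forecast lead time (hours).
# Real cones are sized so roughly two-thirds of historical forecast
# errors fall inside; these placeholder values just show the shape.
error_radius_mi = {12: 30, 24: 50, 48: 90, 72: 140, 96: 190, 120: 240}

for lead_hours in sorted(error_radius_mi):
    r = error_radius_mi[lead_hours]
    circle_area = math.pi * r ** 2
    print(f"+{lead_hours:3d} h: radius {r:3d} mi, "
          f"possible-position circle of {circle_area:,.0f} sq mi")

# Better forecasts shrink these radii, and the cone narrows with them,
# which is exactly the improvement described in this article.
```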

The better the predicted path of a hurricane, the less uncertainty about where, when and whom it might strike, and Gopalakrishnan said “next-generation” efforts in hurricane prediction promise to further reduce that uncertainty.