NOAA eyes upgrade of outdated tsunami warning centers

Centers in Hawaii and Alaska should share common hardware and software to process growing volumes of data.

The two tsunami warning centers the National Oceanic and Atmospheric Administration operates are "remarkably outdated" and need upgrades, said John Orcutt, professor of geophysics at the Scripps Institution of Oceanography in La Jolla, Calif.

Orcutt, who chaired a National Academy of Sciences committee that issued a report in September 2010 on the nation's tsunami warning and forecast system, said the tsunami that battered Japan on March 11 makes a compelling case to upgrade the warning centers.

NOAA put out a request for information (RFI) one week after a magnitude 9.0 earthquake struck Japan last month, generating waves as high as 124 feet. Proposals to upgrade the Pacific Tsunami Warning Center in Honolulu and the West Coast and Alaska Tsunami Warning Center in Palmer, Alaska, are due April 19. Orcutt said the request for information reflects the recommendations in the academy report.

According to the report and the modernization technical roadmap in the RFI, both centers need to be upgraded with a common set of hardware and software tools, which they currently lack.

The two centers also issue warnings in different message formats, which should be standardized, Orcutt said.
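Neither Orcutt nor the RFI specifies what a standardized format would look like. As a rough illustration of the idea, the Python sketch below defines a single bulletin schema that either center could serialize to the same wire format; the field names, alert levels and sample values are hypothetical, not NOAA's actual message structure.

```python
# Hypothetical sketch of a standardized tsunami bulletin that both
# centers could emit. Field names and alert levels are illustrative
# assumptions, not NOAA's actual message format.
from dataclasses import dataclass, asdict
from enum import Enum
import json


class AlertLevel(Enum):
    INFORMATION = "information"
    WATCH = "watch"
    ADVISORY = "advisory"
    WARNING = "warning"


@dataclass
class TsunamiBulletin:
    issuing_center: str          # e.g., "PTWC" or "WCATWC"
    alert_level: AlertLevel
    event_magnitude: float       # preliminary earthquake magnitude
    epicenter_lat: float         # degrees north
    epicenter_lon: float         # degrees east
    origin_time_utc: str         # ISO 8601 timestamp
    affected_regions: list[str]

    def to_json(self) -> str:
        """Serialize to a single wire format both centers can share."""
        payload = asdict(self)
        payload["alert_level"] = self.alert_level.value
        return json.dumps(payload)


# The same schema works regardless of which center issues the bulletin.
bulletin = TsunamiBulletin(
    issuing_center="PTWC",
    alert_level=AlertLevel.WARNING,
    event_magnitude=9.0,
    epicenter_lat=38.3,
    epicenter_lon=142.4,
    origin_time_utc="2011-03-11T05:46:24Z",
    affected_regions=["Japan", "Hawaii", "U.S. West Coast"],
)
print(bulletin.to_json())
```

With a shared schema like this, either center could generate, parse and relay the other's bulletins, which is one precondition for the centers serving as backup sites for each other.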

Paul Whitmore, director of the West Coast and Alaska Tsunami Warning Center, said the independent system architectures of the two centers reflect the fact that they were established separately, 18 years apart.

The Honolulu center went into operation in 1949, three years after a tsunami generated by a magnitude 7.2 earthquake hit the Big Island of Hawaii and killed 159 people, Whitmore said.

The West Coast center went into operation in 1967, in response to a magnitude 9.2 earthquake in 1964 that generated 27-foot tsunami waves, which wiped out the Alaskan village of Chenega, killing 23 of 68 residents. Post-quake tsunamis hit other Alaskan communities in Prince William Sound and caused damage from Hawaii to California.

The independent nature of the two centers, NOAA said in its RFI, means they operate disparate warning systems on different hardware and software platforms. The lack of a common information technology architecture results in higher hardware and software maintenance costs, and makes it difficult to share staff between the two centers or operate as fully functional backup sites for each other, NOAA said.

The centers base their predictions on earthquake data recorded by seismographs and on sea-level measurements from tidal gauges and Deep-ocean Assessment and Reporting of Tsunamis (DART) buoys. Until 2004, NOAA had only six DART buoys in operation in the Indian and Pacific Oceans.

But after a magnitude 9.2 earthquake off the coast of Indonesia on Dec. 26, 2004, generated tsunamis that killed some 230,000 people, NOAA expanded the DART network to 39 buoys, Whitmore said.

The increased data flow from these buoys, along with some 500 tidal gauges and other instruments, means NOAA needs new technology to process the growing mass of data, Whitmore said. According to the RFI, "The amount of data processed and the type and number of products have increased substantially since the establishment of the centers."
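The RFI does not describe the processing pipeline itself, but the basic computation the buoy and gauge data feed is straightforward to sketch: compare observed sea levels against the predicted tide and flag sustained deviations. The minimal Python example below illustrates that idea; the 3-centimeter threshold and sample readings are assumptions for illustration, not NOAA's operational detection algorithm.

```python
# Simplified sketch of tsunami detection from sea-level data: flag a
# station when observations deviate from the predicted tide by more
# than a threshold. The 3 cm threshold and the sample readings are
# illustrative assumptions; NOAA's operational algorithm is more involved.

def detect_anomaly(observed, predicted_tide, threshold_m=0.03):
    """Return sample indices where |observed - predicted| exceeds threshold."""
    return [
        i for i, (obs, tide) in enumerate(zip(observed, predicted_tide))
        if abs(obs - tide) > threshold_m
    ]


# Illustrative readings in meters; a small wave arrives mid-record.
predicted = [2.00, 2.01, 2.02, 2.03, 2.04, 2.05]
observed = [2.00, 2.01, 2.02, 2.09, 2.12, 2.06]

hits = detect_anomaly(observed, predicted)
if hits:
    print(f"possible tsunami signal at samples {hits}")  # -> [3, 4]
```

Running this same check continuously across 39 buoys and some 500 tide gauges, at far higher sampling rates, is the kind of workload the RFI says the centers' current systems were not built to handle.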

The Alaska center currently runs on a PC-based architecture using the Microsoft Windows XP operating system, while the Honolulu center runs on a Unix architecture. NOAA wants to develop a common architecture that will make it easier to introduce new technology at both centers, the RFI said.

NOAA also wants to develop common system software to ensure computational techniques and user interfaces are consistent between the two centers.

Orcutt estimated that designing the architecture and purchasing new hardware would cost about $2 million, a small investment to improve warnings of potentially devastating events.
