NOAA eyes upgrade of outdated tsunami warning centers

The two tsunami warning centers the National Oceanic and Atmospheric Administration operates are "remarkably outdated" and need upgrades, said John Orcutt, professor of geophysics at the Scripps Institution of Oceanography in La Jolla, Calif.

Orcutt, who chaired a National Academy of Sciences committee that issued a report in September 2010 on the nation's tsunami warning and forecast system, said the tsunami that battered Japan on March 11 makes a compelling case to upgrade the warning centers.

NOAA put out a request for information one week after a magnitude 9.0 earthquake struck Japan last month, generating waves as high as 124 feet. Proposals to upgrade the Pacific Tsunami Warning Center in Honolulu and the West Coast and Alaska Tsunami Warning Center in Palmer, Alaska, are due April 19. Orcutt said the request for information reflects the recommendations in the academy report.

According to the report and the modernization technical road map in the RFI, both centers need to be upgraded with a common set of hardware and software tools, which they currently lack.

The two centers also issue warnings in different message formats, which should be standardized, Orcutt said.
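
To illustrate what a standardized format might look like, here is a minimal sketch of a common bulletin structure both centers could emit. The field names, alert levels and layout are hypothetical, not NOAA's actual message products.

    # Hypothetical example only: field names and alert levels are assumptions,
    # not NOAA's actual bulletin format.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class TsunamiBulletin:
        issuing_center: str        # e.g., "Pacific TWC" or "West Coast/Alaska TWC"
        origin_time_utc: datetime  # origin time of the triggering earthquake
        magnitude: float
        epicenter_lat: float
        epicenter_lon: float
        alert_level: str           # e.g., "WARNING", "ADVISORY", "WATCH"
        regions: list[str]

        def to_text(self) -> str:
            """Render the bulletin in one shared plain-text layout."""
            return (f"{self.alert_level} from {self.issuing_center} "
                    f"at {self.origin_time_utc.isoformat()}\n"
                    f"Magnitude {self.magnitude:.1f} near "
                    f"{self.epicenter_lat:.1f}, {self.epicenter_lon:.1f}\n"
                    f"Regions: {', '.join(self.regions)}")

    print(TsunamiBulletin("Pacific TWC",
                          datetime(2011, 3, 11, 5, 46, tzinfo=timezone.utc),
                          9.0, 38.3, 142.4, "WARNING",
                          ["Hawaii", "U.S. West Coast", "Alaska"]).to_text())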

Paul Whitmore, director of the West Coast and Alaska Tsunami Warning Center, said the centers' separate system architectures reflect the fact that they were established independently, 18 years apart.

The Honolulu center went into operation in 1949, three years after a tsunami generated by a magnitude 7.2 earthquake hit the Big Island of Hawaii and killed 159 people, Whitmore said.

The West Coast center went into operation in 1967, in response to a magnitude 9.2 earthquake in 1964 that generated 27-foot tsunami waves, which wiped out the Alaskan village of Chenega, killing 23 of 68 residents. Post-quake tsunamis hit other Alaskan communities in Prince William Sound and caused damage from Hawaii to California.

The independent nature of the two centers, NOAA said in its RFI, means they operate disparate warning systems on different hardware and software platforms. The lack of a common information technology architecture results in higher hardware and software maintenance costs, and makes it difficult to share staff between the two centers or operate as fully functional backup sites for each other, NOAA said.

The centers base their predictions on earthquake data recorded by seismographs and on sea-level data from tidal gauges and Deep-Ocean Assessment and Reporting of Tsunamis (DART) buoys. Until 2004, NOAA had only six DART buoys in operation in the Indian and Pacific Oceans.

But after a magnitude 9.2 earthquake off the coast of Indonesia on Dec. 26, 2004, generated tsunamis resulting in the deaths of 230,000 people, NOAA increased the number of DART buoys deployed to 39, Whitmore said.

The increased data flow from these buoys, along with some 500 tidal gauges and other instruments, means NOAA needs new technology to process masses of data, Whitmore said. According to the RFI, "The amount of data processed and the type and number of products have increased substantially since the establishment of the centers."
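
As a rough illustration of the kind of bulk screening that processing involves, the sketch below flags de-tided sea-level readings that exceed a fixed amplitude. The threshold, data layout and function name are assumptions, far simpler than the centers' actual detection algorithms.

    # Hypothetical sketch; the threshold and data layout are assumptions, and
    # the centers' real detection algorithms are far more sophisticated.
    def flag_exceedances(residuals, threshold_m=0.3):
        """Return indices where a buoy's de-tided sea-level residual (meters)
        exceeds the threshold amplitude in either direction."""
        return [i for i, r in enumerate(residuals) if abs(r) > threshold_m]

    # One buoy's residuals in meters; the 0.40 m and 0.35 m pulses stand out.
    readings = [0.01, -0.02, 0.00, 0.03, -0.01, 0.40, 0.35, 0.02, -0.01]
    print(flag_exceedances(readings))  # -> [5, 6]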

The Alaska center currently runs on a PC-based architecture using the Microsoft Windows XP operating system, while the Honolulu center runs on a Unix architecture. NOAA wants to develop a common architecture that will make it easier to introduce new technology at both centers, the RFI said.

NOAA also wants to develop common system software to ensure computational techniques and user interfaces are consistent between the two centers.
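
One way to picture such common system software is a single shared routine that both centers call, so a core computation produces the same answer everywhere. The shallow-water wave-speed formula sqrt(g * h) used below is textbook physics; packaging it as a shared module for the centers is purely an assumption for illustration.

    # The shallow-water wave speed sqrt(g * depth) is standard physics; treating
    # it as a routine in a shared library used by both centers is an assumption
    # made here purely for illustration.
    import math

    G = 9.81  # gravitational acceleration, m/s^2

    def tsunami_speed_ms(ocean_depth_m):
        """Long-wave propagation speed in m/s for a given ocean depth,
        using the shallow-water approximation sqrt(g * h)."""
        return math.sqrt(G * ocean_depth_m)

    # At a typical deep-ocean depth of 4,000 m the wave moves at about 198 m/s,
    # roughly 440 mph, one reason minutes matter when issuing a warning.
    print(f"{tsunami_speed_ms(4000):.0f} m/s")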

Orcutt estimated that designing the architecture and purchasing new hardware would cost about $2 million, a small investment to improve warnings for potentially devastating events.
