Build a better test for your emergency response systems

There are a variety of techniques and tools that agencies can use to get the most out of the time and money they spend testing their emergency management systems.

With fears that swine flu could spread to deadly pandemic proportions, health and emergency management officials can’t help but wonder how well their response plans, despite their best efforts, would perform in the face of a real large-scale crisis.

That uncertainty is one of the harsh truths of the contingency planning business, and it underscores the critical need to regularly test technology-based emergency systems and resources.

The rub is that the cost of a single exercise can easily run into six figures for a large agency, and some agencies might need to run multiple tests a year, said Chip Hultquist, emergency management laboratory technical director of the National Security and Emergency Management Program at Oak Ridge Associated Universities (ORAU), a consortium that works with U.S. national laboratories and agencies, including the Energy Department.

Expenses also can climb unnecessarily for organizations that lack standardized testing processes and must retrace steps for each test objective and scenario. But the ultimate cost of a response plan that fails when called upon would be measured in lives and injuries.

There are a variety of techniques and tools agencies can use to maximize their investment in testing emergency management systems.

For example, ORAU offers a free software application it developed for DOE to help plan, conduct and assess emergency management tests. A growing number of agencies, including the Veterans Affairs Department, are using the application. VA implemented a customized version called Exercise Builder-Hospital partly in response to the swine flu outbreak.

Testing NIMS

The cornerstone for most federal, state and local emergency response plans is the Federal Emergency Management Agency’s framework for best practices, called the National Incident Management System.

Released in March 2004 as a result of Homeland Security Presidential Directive 5, NIMS guides public agencies and first responders on how to launch systems for reliable communications and data sharing during emergencies. It includes a set of standard principles, terminology and organizational processes to enable coordinated incident management at all levels of government.

One of the most critical pieces is a requirement for an emergency operations center (EOC) and technologies that help officials create a common operating picture for emergency responders. NIMS compliance has been a condition for federal preparedness grants since 2006.

Just as a consistent approach to emergency response is the foundation of NIMS, consistency is critical in testing the IT systems that support emergency responses, Hultquist said.

Successful testing begins with defining exercise objectives. For example, tests should assess how well an EOC’s IT systems share and coordinate information with users, including those in other jurisdictions.

“Being able to link up your organizations under the incident command system and a unified command system is a big part of NIMS, so that’s something that you want to measure,” Hultquist said.

Another key element of NIMS is the master scenario events list (MSEL), a timeline of what should happen and when it should take place as an emergency unfolds. “You are looking for expected responses based on your testing objectives to judge the ability of these organizations to work well together and coordinate information,” he said.
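
To make the concept concrete, the toy Python sketch below models an MSEL as a simple timeline of injects paired with the responses evaluators expect to see. The events are invented for illustration and are not drawn from any actual exercise.

```python
from datetime import timedelta

# Invented injects for illustration; a real MSEL is built from the
# exercise's own objectives.
MSEL = [
    # (time into exercise, inject, expected response)
    (timedelta(minutes=0), "Explosion reported at rail yard",
     "EOC activated; incident command established"),
    (timedelta(minutes=15), "Plume model received from state lab",
     "Evacuation zone shared with neighboring jurisdictions"),
    (timedelta(minutes=45), "Hospital reports surge in walk-ins",
     "Mutual-aid request logged and tracked"),
]

# Walk the timeline in order, printing each inject and what should follow.
for when, inject, expected in sorted(MSEL):
    print(f"T+{when}  inject: {inject}  ->  expect: {expected}")
```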

To promote testing efficiency, ORAU worked with DOE’s Emergency Management Issues Special Interest Group (EMI SIG) to develop the Exercise Builder software application.

ORAU distributes the standard version of Exercise Builder for free to public agencies that want to use its templates for defining, creating and analyzing tests. The software guides test planners through the process of identifying components of the emergency management system to test and then saves the choices to speed this step when agencies conduct similar tests.

ORAU said one of its customers used Exercise Builder to cut testing costs by almost 50 percent during the past two years by reducing the amount of time staff members devoted to developing test plans.

“Agencies can preload their site-specific information into the program and then select various scenarios, such as a dirty bomb, a hurricane, or a hazardous material spill,” said Dorothy Cohen, DOE’s fixed facilities emergency management group manager and head of the EMI SIG. By selecting one of those categories, agencies will see preset test objectives and scenarios.
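
As a purely hypothetical sketch of that template idea (not Exercise Builder's actual data model), site details could be stored once and paired with preset objectives keyed by scenario category:

```python
# Hypothetical data model for illustration only; the objectives below
# are invented examples, not Exercise Builder's presets.
SITE = {"name": "Example Site", "jurisdiction": "Example County"}

SCENARIO_TEMPLATES = {
    "dirty bomb": ["Activate the EOC within 30 minutes",
                   "Establish unified command"],
    "hurricane": ["Verify shelter communications",
                  "Coordinate with National Weather Service feeds"],
    "hazmat spill": ["Notify downwind jurisdictions",
                     "Track responder exposure"],
}

def build_exercise(category):
    """Pair stored site data with the preset objectives for a category."""
    return {"site": SITE, "scenario": category,
            "objectives": SCENARIO_TEMPLATES[category]}

print(build_exercise("hurricane"))
```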

Important questions to ask when evaluating systems include:

  • Are systems consistently and effectively managing the emergency?
  • Do responders know what their colleagues are doing?
  • Does the system provide a common operating picture as NIMS calls for?

Stick to the script

When it’s time to run an exercise, computerized scripts can help carry out important tests, such as stress tests of how well IT systems handle the steep spikes in user traffic anticipated during an emergency.

The scripts send requests to emergency response applications to replicate sudden demands for resources and information, which is more practical and less costly in staff time than asking hundreds of users to simultaneously log in to the system. The Virginia Department of Emergency Management (VDEM) uses NeoLoad from Neotys to simulate demand scenarios.

“We write the scripts based on what people will do during an augmentation, such as making requests for assistance, mission tracking or surfing the Internet to look up information,” said Bobbie Atristain, VDEM's IT director.
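
VDEM's NeoLoad scripts aren't public, but the underlying technique, firing a burst of concurrent requests that mimic typical user actions, can be sketched in a few lines of Python. The endpoints and action names below are hypothetical stand-ins, not VDEM's actual systems:

```python
import concurrent.futures
import urllib.request

BASE_URL = "https://eoc.example.gov"  # hypothetical EOC application

# Each tuple is (action name, path), loosely mirroring the user actions
# Atristain describes: assistance requests, mission tracking, lookups.
USER_ACTIONS = [
    ("request_assistance", "/api/assistance"),
    ("track_mission", "/api/missions/42"),
    ("lookup_info", "/docs/hurricane-checklist"),
]

def simulate_user(action):
    """Issue one request representing a single simulated user's action."""
    name, path = action
    try:
        with urllib.request.urlopen(BASE_URL + path, timeout=10) as resp:
            return name, resp.status
    except OSError as exc:  # covers URLError, timeouts, refused connections
        return name, f"error: {exc}"

if __name__ == "__main__":
    # Cycle through the action list to build a burst of 100 simulated users.
    burst = (USER_ACTIONS * 34)[:100]
    with concurrent.futures.ThreadPoolExecutor(max_workers=100) as pool:
        for name, outcome in pool.map(simulate_user, burst):
            print(name, outcome)
```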

VDEM tests the Virginia Interoperability Picture for Emergency Response, one of the commonwealth’s main emergency response systems, each quarter. VIPER was designed for redundancy, with six servers that can run the system's software, plus two additional Web servers and two database servers. The servers are separated geographically and can take on additional loads if companion hardware or software fails.

That level of redundancy is necessary because the state’s EOC is responsible for critical activities, such as dispatching emergency medical flights. “Our mission-critical systems need to have zero downtime,” Atristain said.

VIPER aggregates data from a range of sources, including the National Weather Service and Virginia’s crisis management system, and it uses icons to represent the information on large screens in the EOC. The system serves a diverse constituency that includes state agencies that are a part of the Virginia emergency response team, federal entities and reservists.

“Our hope is they can quickly see there’s a bomb or a fire without having to go through a half-hour briefing about what’s going on and where,” Atristain said.

The quarterly tests of VIPER give VDEM a baseline measurement for how many people can log on at any given time. “If we found that any particular application can’t handle a huge load and it was mission critical, then we would limit access to who could actually log on to that machine,” she said.

To make the scripts as realistic as possible, Atristain interviewed users of the EOC to learn how they interacted with the system when a hurricane or other emergency hit. She also talked to the IT staff about their biggest complaints — and those of users — when technology resources were stretched during emergencies.

VDEM then used this information to devise test scripts that pushed the systems beyond even worst-case scenarios in terms of the number of users.

Those tests, in addition to research aided by the project manager of Virginia’s operations division, uncovered EOC system problems, such as crashes when about 50 people tried to use an application simultaneously. VDEM made system design changes that have thus far avoided further breakdowns.
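
One common way such a breaking point surfaces is a stepwise ramp that keeps raising the number of simultaneous users until requests begin to fail. The sketch below illustrates the idea with a hypothetical endpoint and placeholder step sizes, not VDEM's actual test parameters:

```python
import concurrent.futures
import urllib.request

TARGET = "https://eoc.example.gov/login"  # hypothetical endpoint

def hit(_):
    """Return True if one request succeeds, False otherwise."""
    try:
        with urllib.request.urlopen(TARGET, timeout=10):
            return True
    except OSError:
        return False

def ramp(start=10, step=10, limit=200):
    """Raise the number of simultaneous users until requests start failing."""
    for users in range(start, limit + 1, step):
        with concurrent.futures.ThreadPoolExecutor(max_workers=users) as pool:
            failures = list(pool.map(hit, range(users))).count(False)
        print(f"{users} users: {failures} failed requests")
        if failures:
            return users  # first load level at which the system broke
    return None  # no breaking point found within the limit

if __name__ == "__main__":
    ramp()
```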

“It took stress testing and some trial and error over the course of a year to get the system to where it needed to be,” she said.

Although formal testing is essential to maintaining emergency systems, agencies shouldn’t overlook the benefits of informal testing, which they might be able to do more frequently and cheaply.

For example, when Virginia takes a data center offline to add new servers or applications, staff members who need access to resources at that site are automatically rerouted to alternative locations by technologies intended to maintain continuity of operations. The ease and speed of that switch offer an informative test of emergency operations.

Users typically aren’t told about a scheduled shutdown, she added. “If people know that, they will see things that aren’t there,” Atristain said. “So we shut down a location and see if everything falls into place.”
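
A minimal sketch of that kind of informal failover check, assuming hypothetical health-check URLs for a primary and an alternate site, might look like this:

```python
import urllib.request

# Hypothetical health-check URLs for two geographically separated sites.
SITES = [
    "https://site-a.eoc.example.gov/health",
    "https://site-b.eoc.example.gov/health",
]

def reachable(url):
    """Return True if the site answers its health check with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.status == 200
    except OSError:
        return False

# Continuity of operations holds as long as at least one site answers.
if any(reachable(url) for url in SITES):
    print("Failover OK: at least one site is serving traffic.")
else:
    print("Outage: no site reachable.")
```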

Post-mortem analysis

Applications such as Exercise Builder can also help agencies when testing is complete. DOE uses the software’s after-action reports to identify areas that require corrective action based on the test parameters and expected outcomes defined in the original objectives. Evaluators weigh any shortcomings to prioritize areas that need the most attention.

“The result is a document that [the agency can] live with for the next year to make sure that those things that didn’t work well are fixed,” Hultquist said.

DOE also uses the EMI SIG to disseminate information it gathers using Exercise Builder. At the EMI SIG’s annual meetings, which more than 300 DOE staff and contractors typically attend, a technology subcommittee gathers to discuss how various sites are dealing with emergency management issues, EMI SIG’s Cohen said.

Despite frameworks such as NIMS and electronic tools such as Exercise Builder and traffic-simulating scripts, agencies shouldn’t become so focused on IT and testing that they lose sight of the primary objective.

“You are not doing a test of an IT system; you are doing a test of how IT is applied as part of the bigger whole,” Hultquist said, referring to the overall response to an emergency.
