We've posted a couple of items recently (here and here) about how vital disaster recovery plans and independent verification are for the success of IT systems. We came across another example that highlights the pitfalls of poor safety planning -- and sometimes the dumb luck that comes into play to avoid them.
In his cheeky autobiography, Nobel Prize-winning physicist Richard Feynman recounts how, when the United States worked on the Manhattan Project during WWII, he was sent to verify safety procedures at the Oak Ridge uranium processing plant. He found the facility on the verge of exploding (he all but read the riot act to them). When Feynman returned to the plant several months later, he toured the facility with a military escort. The two entered a room, where there were "these two engineers and a loooooong table covered with a stack of blueprints," Feynman wrote.
The engineers explained that the plant had been designed with backup safety valves, so that if any one valve failed, the backup valve would take over and avert disaster. Feynman looked over the blueprints and, he relates, "I'm completely dazed! Worse, I don't know what the symbols on a blueprint mean! There is some kind of a [symbol] thing that at first I think is a window."
Feynman points to one of the mysterious window-like symbols and asks the engineers what happens if that valve gets stuck, expecting them to tell him that the symbol is not a valve but a window. Instead, the engineers eye each other and start discussing what would indeed happen.
The engineers "turn around to me and they open their mouths like astonished fish, and say, 'You're absolutely right, sir,'" Feynman wrote. "So they rolled up the blueprints and away they went and we walked out."
Feynman's military escort demanded to know how on Earth he could have known that one part of the blueprint design was faulty. Feynman wrote, "I told him, you try to find out whether it's a valve or not."