The scientific method applied to IT management

To manage IT well, follow the lessons of one of the pioneers of the Industrial Revolution, advises consultant Warren Suss.

To manage technology effectively, agency leaders would be wise to follow the lead of Frederick W. Taylor, the father of scientific management and widely considered one of the very first management consultants, said Warren Suss in opening remarks at the Federal Networks 2011 conference.

Taylor (1856-1915) wrote “The Principles of Scientific Management,” and his exacting measurement, analysis and training of the workforce was of great influence in ushering in the second wave of the Industrial Revolution, said Suss, president of Suss Consulting, a co-sponsor of the conference. Suss took Taylor’s lesson and applied it to the current problems facing government IT projects and the challenges facing contractors in today’s fiscal environment.


“In dealing with this added oversight from Congress and the administration, IT leaders will do well to borrow a page from Taylor’s playbook,” Suss said. “The argument is well established – there’s the choice between the high costs of doing nothing – limiting mission performance by sticking with inefficient tools and methods – versus achieving improved results by leveraging next-generation technologies and processes.”

Taylor was a great proponent of diligently quantifying performance; his favorite tool, Suss said, was a stopwatch. “The problem here, of course, is that we’re not manufacturing pig iron, and the output of our IT investments is tough to measure with a stopwatch,” Suss said. “Taylor could measure a factory’s output in tons of pig iron, which was the real payoff, not just in cost savings. Until we’ve done a better job measuring the output side of the federal IT investment equation, we’re stuck with a losing game of targeting diminishing cost savings by cutting our investment in the government.”

Suss allowed us to publish the entire speech, which appears in full below.

Opening remarks of Warren Suss, president of Suss Consulting, Inc., to the Federal Networks 2011 conference held at the Hilton McLean, in McLean, Va., on Feb. 15:

Good morning. Welcome to the 24th annual Federal Networks Conference.

100 years ago, Frederick W. Taylor published "The Principles of Scientific Management." I think of Taylor as the first management consultant. Others think of him as the father of industrial engineering. Today I’d like to discuss how Taylor’s work and his legacy can help us address some of the toughest challenges in today’s federal IT marketplace.

Taylor advocated the application of scientific methods to the workplace. His approach was centered on time and motion studies. His techniques for optimizing workplace productivity contributed to the rise of the United States as a global industrial power. His introduction of quantitative, analytical tools to re-engineer business processes has now been extended and adopted as a central part of the canon of today’s management theory and practice. His focus on efficiency is now, 100 years later, deeply embedded in the modern worldview. It has become a core value in today’s corporate world and frames the way we organize our day-to-day personal activities.

Many of us remember the father from "Cheaper by the Dozen," Frank Gilbreth, who, along with his wife, Lillian, organized his 12 children’s day-to-day activities according to the principles of scientific management. Gilbreth, also a consultant, spread the gospel of scientific management through a refined version of Taylor’s approach and laid the foundations for today’s continuous process improvement methodologies using a systematic study of motions called “Therbligs,” a term based on Gilbreth’s name spelled backward. Another Taylor disciple, Henry Gantt, developed the charts that we still use today to plan project tasks, identify interdependencies and manage schedules.

A revolution in the industrial mechanization of manual labor automated Taylor’s search for efficiency and powered American industry to world dominance. My great-grandfather, Samuel Persky, played a small part in this revolution when he came to Brooklyn from Russia at the turn of the 20th century and landed a job with a candy company where he developed machinery to automate manual processes for making chocolate-covered cherries and Jordan almonds.

Taylor’s main tool was the stopwatch, but he paired his measurements with improved analysis, planning, training and supervision to boost the productivity of pig iron manufacturing. Today, we’re here to explore how we can make better use of our main tools – networks and related information technologies – to improve the efficiency and effectiveness of government and military operations.

In Taylor’s time, pressures to keep a lid on railroad rates for interstate commerce were a key factor driving the adoption of scientific management in iron manufacturing. Today, pressures to reduce government spending will take center stage and will drive important changes and create big opportunities and risks in federal IT. The change in Congress, record deficits accelerated by the stimulus program, and limited near-term prospects for increased revenues due to the lingering effects of the recession and the recent tax compromise will result in unrelenting pressure to reduce the federal government’s spending. Our central challenge as a community is to develop a creative response to budget pressures – a response that will have long-term, positive impacts on the efficiency and effectiveness of our government and military institutions, just as Taylor’s response to the railroad rate pressures of his day led to a transformation in the productivity of American industry. Our central challenge as competitors for government contracts is not just to weather the storm brought on by today’s budget pressures but to adapt more quickly and more effectively than the competition during the coming period of marketplace disequilibrium.

Today’s budget pressures will shape the big upcoming federal IT initiatives that our speakers will be talking about today and tomorrow, as well as the growing number of smaller task-order competitions on indefinite-quantity, indefinite-delivery contracts.

This year, political, organizational and technological forces will reshape the environment to put the spotlight on new large, high-profile IT initiatives – what the Defense Department calls “programs of record.” From the political side, oversight committees in Congress -- particularly in the House -- will be looking for opportunities to hold hearings, write reports and pressure the administration to limit government spending on programs of record. Large IT initiatives already launched that experience overruns, delays or technical problems will be easy targets, and any new large “programs of record” will draw fire from the budget-cutters.

The Office of Management and Budget and the Pentagon will lead the Obama administration’s IT cost-savings drive. OMB has its dashboards tuned to zero in on the government’s top-dollar IT works in progress. On the Defense Department side, Secretary Robert Gates’ five-year plan to cut $100 billion out of the department’s budget to free up funding for combat readiness will give even greater clout to watchdog IT policy and oversight organizations, including the undersecretary of defense for acquisition, technology and logistics, or AT&L, as well as the Office of the Secretary of Defense for Cost Assessment and Program Evaluation (CAPE), formerly known as Program Analysis and Evaluation (PA&E).

This increased oversight from the political and administrative sides will introduce new levels of budget and schedule uncertainty into large IT programs. The delays, stops, starts and changes we’ve seen recently on big DOD and civil agency acquisitions foreshadow even more risks for tomorrow’s big initiatives. OMB, AT&L and CAPE are masters at implementing centralized IT oversight’s law of unintended consequences. In the name of improving the management of IT portfolio investments, they can slow down the approval of important new initiatives. In the name of strengthening IT project management, they can add redundant layers of time-consuming review and reporting requirements, second-guessing the front-line managers who usually have a better understanding of the customer environment and the day-to-day technical, cost and schedule challenges than White House and Pentagon staffers.

In dealing with this added oversight from Congress and the administration, IT leaders will do well to borrow a page from Taylor’s playbook. In the current political environment, the language of improved productivity is the best way to sell or defend large programs. The argument is well established – there’s the choice between the high costs of doing nothing – limiting mission performance by sticking with inefficient tools and methods – versus achieving improved results by leveraging next-generation technologies and processes.

The problem here, of course, is that we’re not manufacturing pig iron, and the output of our IT investments is tough to measure with a stopwatch. As a result, there’s a tendency in the federal space to focus our measurements on the cost side of the equation. Taylor could measure a factory’s output in tons of pig iron, which was the real payoff, not just in savings through personnel cuts. Until we’ve done a better job measuring the output side of the federal IT investment equation, we’re stuck with a losing game of targeting diminishing cost savings by cutting our investment in the government and contractor personnel, services and systems used to run our IT shops and networks.

Instead, we need to tackle the tougher job of measuring output in terms of meaningful results, clearly tied to the federal and military mission. We also need to communicate these results more effectively to our agency leadership, to Congress, and to the American people. Until we do a better job applying our analytics and technologies to develop and communicate credible, powerful measurements of the return from our IT investments, such as improved health outcomes, reductions in tax fraud, minimization of battlefield casualties, and amelioration of environmental pollution, we’ll be consigned to a fate of slow IT program deaths by a thousand budget cuts.

From an industry perspective, budget pressures and increased oversight will result in shrinking opportunity pipelines. Now this is a big challenge, but not for the obvious reason. Yes, there will be fewer new mega-deals coming up, but it’s not that the total size of the federal IT market is shrinking. True, the rate of growth in total IT spending is leveling off, but according to OMB data, the government still plans to spend $78.5 billion on federal IT in fiscal 2011, a slight rise from IT spending in fiscal 2010. What’s really happening is that the government is shifting its emphasis from large programs of record to smaller incremental spending. In effect, yesterday’s marketplace was characterized by a higher ratio of big deals to incremental spending than tomorrow’s marketplace. Industry’s opportunity pipelines tend to focus on the big deals and are generally blind to incremental opportunities.

In order to win in this transformed marketplace, industry needs to do two things. First, it needs to compete more aggressively and effectively to win the shrinking number of big “programs of record.” Second, it needs to transform its marketing and sales strategies and tactics to position more effectively for the relative increase in incremental spending.

To stay ahead of the pack as the number of big deals shrinks, borrow a page from Taylor’s principles of scientific management. Begin by taking a more objective approach to building and prioritizing your federal opportunity pipelines and fine-tuning your capture plans. Pipelines and capture plans look like straightforward management tools – a list of opportunities and detailed outlines of the strategies and tactics for pursuing them. In reality, they are often political documents. Though they’re packaged with the wrapping of scientific management, including at least one Gantt chart, when you look inside, they’re often artful works of fiction. If sales are down, just throw some more opportunities onto the pipeline whether or not they’re a good fit with core competitive strengths. As you expand your list of targeted deals, gloss over the fact that you’ve just cut back on your capture and proposal resources. If the big capture initiative is at risk of a no-bid decision in your next gate review, gloss over some bad news about the opportunity from your customer or put a more positive spin on some key company weaknesses in your capture plan. These fictions are doubly destructive. They distract management attention from true opportunities, and they dilute scarce proposal and capture resources.

Instead, during tough times, you need to do a more honest, objective job -- dare we call it scientific? -- of assessing each opportunity’s fit with your company strengths and aligning your capacity with your growth objectives. Gather more accurate data on the strengths and weaknesses of a hungry competitive field. Improve your approaches to assessing and updating information on your customer’s technical requirements, acquisition strategies and opportunity timing. Make sure that your proposal and capture shop is sized right to handle the demands of your pipeline. Be sure your projected win probabilities are adjusted to account for the increased competitive intensity resulting from the decline in the number of new large deals. And bring a new level of rigor to your gate reviews, so the decisions on when to hold 'em and when to fold 'em feel more like card counting and less like counting on Lady Luck.
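As a purely illustrative sketch of what that more objective pipeline discipline might look like, here is a short Python example that weights each opportunity by an adjusted win probability and a fit score, then keeps only what the proposal shop can actually staff. The opportunity names, dollar values, probabilities, fit scores and the downward competition adjustment are all hypothetical assumptions, not figures from the speech.

```python
# Hypothetical sketch: probability-weighted pipeline triage.
# Every opportunity, score and factor below is an illustrative assumption.

from dataclasses import dataclass

# Assumed penalty reflecting more bidders chasing fewer large deals.
COMPETITION_ADJUSTMENT = 0.8

@dataclass
class Opportunity:
    name: str
    ceiling_value: float         # total contract value, in dollars
    raw_win_probability: float   # capture team's unadjusted estimate (0-1)
    fit_score: float             # 0-1 alignment with core competitive strengths
    proposal_staff_needed: int   # people required to bid credibly

    @property
    def adjusted_win_probability(self) -> float:
        return self.raw_win_probability * COMPETITION_ADJUSTMENT

    @property
    def expected_value(self) -> float:
        return self.ceiling_value * self.adjusted_win_probability * self.fit_score

def triage(pipeline: list, proposal_staff_available: int) -> list:
    """Rank opportunities by expected value and keep only those the
    proposal shop can actually staff, rather than bidding everything."""
    ranked = sorted(pipeline, key=lambda o: o.expected_value, reverse=True)
    selected, staff_used = [], 0
    for opp in ranked:
        if staff_used + opp.proposal_staff_needed <= proposal_staff_available:
            selected.append(opp)
            staff_used += opp.proposal_staff_needed
    return selected

if __name__ == "__main__":
    pipeline = [
        Opportunity("Agency A network recompete", 250e6, 0.35, 0.9, 12),
        Opportunity("Agency B cloud task order", 40e6, 0.50, 0.7, 4),
        Opportunity("Agency C data center IDIQ", 600e6, 0.15, 0.4, 20),
    ]
    for opp in triage(pipeline, proposal_staff_available=18):
        print(f"{opp.name}: expected value ${opp.expected_value:,.0f}")
```

The point of the sketch is the discipline, not the arithmetic: an opportunity that does not fit core strengths or cannot be staffed drops out before it dilutes capture and proposal resources.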

Most important of all, adapt Taylor’s scientific methods to your pricing strategies. Doing a better job at pricing your federal proposals will be the single most important factor determining whether you’ll end up as a winner or a loser in today’s federal IT marketplace. Effective pricing begins at the corporate and divisional levels, where today’s increasingly competitive environment calls for fundamental changes to overhead, fringe and G&A levels. You can’t succeed without help from “corporate” to align the loadings they place on your bids with today’s competitive realities.

Your friends from corporate can help you get within striking distance of a winning price, but you’ll also need a new, more disciplined approach -- again, dare we call it scientific? -- to getting the rest of the way to victory. Is your vice president populating the teams for your big proposals with the companies of his or her golfing buddies rather than picking team members based on who can get you to the most aggressive bid price? Are you waiting until two weeks before the proposal is due, when your engineers have finally completed all the details of your technical solution, to bring your pricing people in to cost out the bid? Are you sharpening your pencil for all of your price drivers – loadings, labor cost, equipment cost, spares and maintenance, warranties, fees, subcontractor loadings, escalation, and procurement overhead? In today’s marketplace, you’ll need all of Taylor’s tools -- measurement, analysis, planning, training and supervision -- to design and implement more aggressive, innovative pricing strategies.
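To make the pricing arithmetic concrete, here is a minimal sketch of a fully loaded labor-rate buildup. The buildup order (fringe on direct labor, overhead on the fringed base, G&A, then fee) is one common structure, and every percentage below is a hypothetical assumption rather than a figure from the speech.

```python
# Hypothetical sketch: building a fully loaded, fee-bearing hourly labor rate.
# The fringe -> overhead -> G&A -> fee order is one common buildup;
# all rates below are illustrative assumptions.

def loaded_hourly_rate(base_rate: float, fringe: float, overhead: float,
                       g_and_a: float, fee: float) -> float:
    """Apply loadings multiplicatively to a base hourly labor rate."""
    direct_plus_fringe = base_rate * (1 + fringe)
    with_overhead = direct_plus_fringe * (1 + overhead)
    with_g_and_a = with_overhead * (1 + g_and_a)
    return with_g_and_a * (1 + fee)

if __name__ == "__main__":
    # A $60/hour engineer under two hypothetical rate structures.
    legacy = loaded_hourly_rate(60, fringe=0.32, overhead=0.45, g_and_a=0.12, fee=0.07)
    leaner = loaded_hourly_rate(60, fringe=0.30, overhead=0.35, g_and_a=0.09, fee=0.06)
    print(f"Legacy structure: ${legacy:.2f}/hr")
    print(f"Leaner structure: ${leaner:.2f}/hr")
    print(f"Difference over a 2,000-hour labor year: ${(legacy - leaner) * 2000:,.0f}")
```

Because the loadings multiply across every labor hour on the bid, even a few points of overhead or G&A can separate a winning price from a losing one, which is why the help from “corporate” matters so much.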

Both government and industry will need to come to terms with another transformation brought on by the government’s drive for greater efficiency and cost effectiveness: the transition from time-and-materials contracting to firm fixed-price, performance-based acquisition strategies. This change, which has been in the works for a while, will be accelerated by today’s budget pressures and will bring new levels of risk to both government and industry.

Now I believe that the risks of fixed-price, performance-based contracting are best understood using the lens of “bounded rationality,” a concept Oliver Williamson borrowed from Herbert Simon and made central to his economics of contracting. This cornerstone of behavioral economics refers to the impossibility of writing a contract that anticipates all significant contingencies, except when contracting for simple commodities. I spoke about this in my opening remarks at last year’s conference. Now in my book, Taylor is to Williamson as Newton is to Einstein. Both great thinkers, both very useful, but their work addresses different ends of the problem. I don’t plan to revisit Williamson today, except to reiterate that improved governance mechanisms are one way to address the many bounded rationality challenges inherent in fixed-price contracting. I would like to look at how Taylor’s scientific management legacy can give us some helpful tools to pave the way to making firm fixed-price, performance-based contracting work.

One way government and industry can manage the fixed-price, performance-based contracting risk is to do a better job capturing and analyzing basis-of-estimate data. Taylor and his disciples were obsessed with measurement. They built their recommendations on extensive data collection calibrated in seconds, inches, and Therbligs. Their productivity targets were based on painstaking data analysis, and so were their supervision methods. Today, in too many cases, neither government nor industry has adequate databases for estimating the level of effort required to address complex IT project challenges. As a result, firm fixed-price, performance-based proposals have become big league games of liar’s poker: Both sides are shooting in the dark, playing “bet your company” or “bet your government program.”
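As a rough illustration of what a basis-of-estimate discipline can look like, here is a short sketch that derives an hours estimate and a spread from historical actuals on comparable tasks. The task categories and hour figures are hypothetical assumptions used only to show the mechanics.

```python
# Hypothetical sketch: deriving a basis of estimate from historical actuals.
# Task names and hour figures are illustrative assumptions.

from statistics import mean, stdev

# Actual labor hours recorded on comparable past tasks, grouped by task type.
HISTORICAL_ACTUALS = {
    "network_site_install": [310, 285, 360, 330, 295],
    "helpdesk_transition":  [1200, 1450, 1100, 1375],
}

def basis_of_estimate(task_type: str, quantity: int) -> dict:
    """Return a point estimate and a one-sigma range for `quantity`
    repetitions of a task type, based on recorded actuals."""
    actuals = HISTORICAL_ACTUALS[task_type]
    avg, spread = mean(actuals), stdev(actuals)
    return {
        "task_type": task_type,
        "quantity": quantity,
        "point_estimate_hours": avg * quantity,
        "low_hours": (avg - spread) * quantity,
        "high_hours": (avg + spread) * quantity,
        "sample_size": len(actuals),
    }

if __name__ == "__main__":
    print(basis_of_estimate("network_site_install", quantity=40))
```

With even this much data behind a bid, the fixed-price negotiation stops being a guessing game and starts to look like the measurement-driven estimating Taylor practiced.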

In the Taylor tradition, we need to focus on how to drive costs down and maintain or increase performance over time. The companies that plan and manage their firm fixed-price, performance-based contracts to meet performance objectives while continuously driving down costs will have the luxury of either capturing the savings as profit or enhancing the service features and benefits delivered to their customers. In a fixed-price, performance-based world, Gilbreth’s notion of continuous process improvement isn’t just a quality assurance goal; it’s also the key to competitive advantage. So the goal of industry players should be to keep ahead of the pack by moving along the productivity curve at a greater speed than the other guys, and the goal of the government should be to set up the rules of the game to encourage this race to greater productivity. It will require big changes on both sides, but the easy way out -- to pre-specify head count, throw in small performance incentives, and ask for fixed-price bids -- sets a floor on cost-reduction opportunities, since labor is almost always industry’s big cost driver. It’s not just old wine in new bottles, but it freezes today’s inefficient processes and encourages a competition for low-priced, minimally qualified labor rather than for continuous, mission-focused productivity improvements in the Taylor tradition.

Earlier, I talked about my Great Grandpa Sammy’s machines for making chocolate-covered cherries and Jordan almonds as an example of how the revolution in the industrial mechanization of manual labor automated Taylor’s search for efficiency at the turn of the 20th century. Well today, two of our top federal IT initiatives -- data center consolidation and cloud computing -- hold similar promise for leveraging our investment in IT technologies to deliver greater efficiency and effectiveness in performing the federal and military mission. But here, again, we need to keep our eye on Taylor’s productivity target. The government is now committed to consolidating today’s 2,000-plus federal data centers. The plan is to invest in a handful of highly efficient enterprise-level government data centers, along with standards-compliant commercial data centers and Internet-based software-as-a-service offerings, to accelerate the government’s move into cloud computing. Here, again, it will be easier to target the cost reduction side of the productivity equation.

Yes, we have the opportunity to reduce the number of data center butts in seats. Yes, we’ll be able to reduce costs for software licenses. But the real potential of these initiatives is to enable the competitive race for Taylor-inspired productivity enhancement we’ve been talking about today. Our real bottom line, in federal and military operations, isn’t how efficiently we operate our data centers. It isn’t how much money we save on software. It’s how we can leverage IT investments like data center consolidation and cloud computing to enhance the quality of the government services we deliver to our citizens. It’s how we can use our technologies to get better, faster results from our humanitarian or military operations. We’ll need to use all of Taylor’s tools -- improved measurement, analysis, planning, training and supervision -- to achieve a 21st century revolution in meaningful results for our federal and military institutions and for the American people.

Thank you.