What's on your dashboard?

Agencies are drowning in data, but tools and strategies to make sense of it all are starting to emerge.

Afghan vote map

The U.S. Agency for International Development mapped voting, violence, signs of fraud and weather in its monitoring of the presidential election in Afghanistan in 2009.

“If you can’t measure it, you can’t manage it.”

So said management expert Peter Drucker, who clearly has some fans in federal IT. The Obama administration’s Office of Management and Budget pushed first TechStat and now PortfolioStat to better measure executive-branch technology investments, and officials said budget submissions for fiscal 2014 “should include a separate section on agencies’ most innovative uses of evidence and evaluation.”

“Many potential strategies have little immediate cost,” wrote Acting OMB Director Jeffrey Zients in the May 2012 budget memo, “and the budget is more likely to fund requests that demonstrate a commitment to developing and using evidence.”

The embrace of analytics goes far beyond OMB and IT spending, however. Agencies are also building systems to measure staff performance, identify both tax fraud and innocent mistakes, predict and prevent crime, ensure proper processing of health insurance claims, and even improve weather forecasts.

Why it matters

Enthusiasm for analytics has ebbed and flowed for decades, tracking rather closely with how much influence MBA types wield in the executive branch. (Zients’ degree is in political science, but he began his career in management consulting at the same Bain and Co. that gave Mitt Romney his start.) Today, however, the surge of measure-to-manage efforts also reflects the reality that there is a lot of data available.

In mid-October, there were 378,529 datasets on Data.gov. The Census Bureau alone has more than 550 million electronic files and 800 terabytes of accessible data. Throw in economic indicators, signals intelligence, satellite imagery and more than 400 million tweets per day, and “big data” begins to sound like an understatement. Harvard Business Review recently reported that “about 2.5 exabytes of data are created each day,” enough to fill roughly 2.5 million 1 TB hard drives. And “that number is doubling every 40 months or so.”
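Those scale figures reduce to straightforward arithmetic. A quick back-of-the-envelope check, assuming decimal units (1 exabyte = 10^18 bytes, 1 terabyte = 10^12 bytes):

```python
# Sanity-checking the HBR scale figures with decimal units.
daily_bytes = 2.5 * 10**18             # ~2.5 exabytes created per day
drives_per_day = daily_bytes / 10**12  # how many 1 TB drives that fills
print(f"1 TB drives filled per day: {drives_per_day:,.0f}")

# "Doubling every 40 months" is equivalent to roughly 23% growth per year.
annual_growth = 2 ** (12 / 40)
print(f"implied annual growth: {annual_growth - 1:.0%}")
```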

All that data has trained the public to expect more transparency, while data visualizations and widely used dashboards such as Google Analytics have shown just how useful properly processed statistics can be.

Perhaps most important, the pressure on agencies to deliver results with limited resources is greater than ever. That was certainly the impetus for PortfolioStat, which requires agencies to review their entire IT investment portfolios and identify opportunities to save money by sharing services, making commodity IT purchases and canceling low-priority projects.

“The federal government must focus on maximizing the return on American taxpayers’ investment in government IT by ensuring it drives efficiency throughout the federal government,” Zients and Federal CIO Steven VanRoekel wrote to agency executives in March.

Next steps

How to plan and launch an analytics project:

1. Identify a tightly defined, mission-critical goal. It should be important enough to justify the effort that analytics will require and ensure that senior executives are invested in the process, but not so big that people are afraid to tackle it.

2. Determine the questions whose answers could help reach that goal. Successful analytics projects gather the relevant data and develop metrics to meet a clearly defined goal.

3. Inventory the data you already have and identify any that you must start collecting. Look beyond your agency — and beyond government, for that matter — when building your dataset.

4. Pick the appropriate tools to capture and analyze the data and share the resulting analytics with decision-makers. The final presentation should be simple and clear. A data scientist might be required to develop the analytics, but managers and executives should not have to be statisticians to understand and use them.

5. Revisit and re-evaluate the questions and metrics on a regular basis. Getting the right statistics can be difficult, and framing the objective properly is even harder.

Legal obligations reinforce the pressure from OMB. A new report from the Partnership for Public Service and the IBM Center for the Business of Government, titled “From Data to Decisions II: Building an Analytics Culture,” notes that the Government Performance and Results Modernization Act of 2010 “calls for agencies to focus on high-priority goals and create a culture where data and analytics play a larger role.”

According to Robert Dolan Jr., worldwide industry executive for public-sector business analytics at IBM, federal officials are turning to analytics for a number of reasons: to improve accountability, drive smarter decision-making, build a culture of results-based government, “achieve the best outcomes for everyone, from everyone” and, of course, “to spend public funds responsibly.”

The fundamentals

Experts from the public and private sectors agree that successful analytics projects require far more than data. The right software and data science expertise are also vital, but the most important ingredient is an executive-level commitment to data-driven decision-making.

“Technology gets you only so far,” Dolan said. “You have to have the process and the culture.”

Belinda Seto, deputy director of the National Institute of Biomedical Imaging and Bioengineering, agreed. Her organization, which is part of the National Institutes of Health, has used data analytics to assess research portfolios and shape funding decisions.

“The staff knows I love data,” Seto told “From Data to Decisions” researchers. “They say, ‘You can’t go see Belinda without data,’ so now they bring data.”

Finding the talented professionals who can turn data into insights is also crucial — and a significant challenge. According to Harvard Business Review, data scientists have the sexiest job of the 21st century, but they are not beating a path to the government.

Agencies that have taken the lead on analytics have found ways to build the needed expertise. Strategies include formal and informal interagency partnerships and temporary reassignments of skilled employees to other agencies. The GeoNerds Meetup group that gathers monthly in Washington, D.C., finds employees from the Census Bureau, Energy Department, National Oceanic and Atmospheric Administration, and other agencies comparing notes with spatial analysts and developers from the private and nonprofit sectors. And the Partnership for Public Service and IBM Center researchers noted that the Defense Department has “created a scientific test and analysis technique center of excellence.... Essentially, the center is training people to be data detectives.”

When it comes to the data, Michael Mauboussin, chief investment strategist at Legg Mason Capital Management and an adjunct professor at Columbia Business School, stresses how easy it is to pick the wrong statistics to track. Most executives “have a gut sense of what metrics are most relevant,” he wrote in an article published in Harvard Business Review in October, “but they don’t realize that their intuition may be flawed.”

According to Mauboussin, good statistics have two qualities: “They are persistent, showing that the outcome of an action at one time will be similar to the outcome of the same action at a later time, and they are predictive, demonstrating a causal relationship between the action and the outcome being measured.”
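Both of Mauboussin's qualities can be checked empirically. In the hedged sketch below, with invented period-by-period figures, persistence is approximated as the correlation of a metric with its own next-period values, and predictiveness as its correlation with the outcome it is supposed to drive:

```python
# Testing a candidate metric for Mauboussin's two qualities,
# using invented figures purely for illustration.
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

metric = [3.0, 3.1, 3.3, 3.4, 3.6, 3.7]   # the statistic being tracked
outcome = [40, 41, 43, 44, 46, 47]        # the result it should predict

# Persistence: does this period's value resemble the next period's?
persistence = pearson(metric[:-1], metric[1:])

# Predictiveness: does the metric actually move with the outcome?
predictiveness = pearson(metric, outcome)

print(f"persistence:    {persistence:+.2f}")
print(f"predictiveness: {predictiveness:+.2f}")
```

Correlation alone does not establish the causal relationship Mauboussin describes, but a metric that scores poorly on either check is unlikely to be worth tracking.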

The authors of the Partnership for Public Service/IBM Center report agreed. “To travel down the analytics road, managers must challenge time-worn assumptions and embrace qualitative measures that are linked to impact,” they wrote. “As with any new activity, managers need to be comfortable experimenting and learning.”

And finally, there are the tools themselves, which run the gamut from simple spreadsheets in Microsoft Excel to staggeringly powerful and complex systems from SAP, IBM, EMC and others.

Jeffrey Zients

Budget requests that show a commitment to developing and using evidence are more likely to be funded, says Acting OMB Director Jeffrey Zients.

“There isn’t really a product per se that’s labeled ‘the big data project,’” said Steve Lucas, global executive vice president of SAP’s Database and Technology division. “It doesn’t quite work like that.”

Despite the precedents set by the federal IT Dashboard and consumer services such as Google Analytics, chart-laden dashboards are not a requirement for analytics. When the Federal Emergency Management Agency wanted to assess the effectiveness of its disaster services, for example, the effort’s first output was a simple flowchart documenting what disaster recovery entails for the organization, and even today some of the core number-crunching is done in Excel. TechStat and PortfolioStat are both built around structured in-person meetings, with the resulting data flowing into an OMB-provided template.

Some metrics and analysis, however, beg for visualization, particularly when they involve location data. Matt Gentile, Deloitte’s principal for geospatial analytics, contended that “the map is becoming the dashboard.” Geospatial tools allow for more sophisticated analyses, he said, and enable agencies to blend multiple layers of data as they measure impact and make decisions. He added that a wide variety of commercial and open-source mapping options are available, many of which can be plugged into broader analytics systems or used on a stand-alone basis.

At the U.S. Agency for International Development, for example, geoanalytics were crucial to election-monitoring efforts in Afghanistan. The results — which combined vote tallies, incidents of violence and signs of fraud — were eventually published for all to see, but the original and primary purpose was to create “an intranet connecting D.C. to Kabul,” said MapBox CEO Eric Gundersen, whose firm built the open-source system. “Staff on the ground could flag potential problems and easily send them around for all to see.”

The hurdles

According to Dolan, the biggest challenge for most agencies is simply getting started. There is a misconception that analytics must make sense of everything, he said, but “the best way to eat the elephant is one bite at a time.”

Bethann Pepoli, chief technology officer of EMC’s State and Local Government Division, agreed. “Instead of reacting to every situation and every request, we’re suggesting that you be more proactive in identifying challenges and the solutions you want,” she said.

Lucas said agencies should resist the strong temptation to create a strategy around the data they already have rather than the questions they need to answer. “There’s probably a little too much bias toward what you have in your pocket,” he said, but real insights often require sources from outside the agency. “We know that any real breakthrough...is always some kind of a mashup.”

Failure to clearly explain those goals and get departmental buy-in can also cripple a project. TechStat, the precursor to PortfolioStat that focused on individual IT projects, has faced that challenge. As a government official in the “From Data to Decisions” focus groups put it, “There’s still very much a culture of fear of metrics — fear that the data can be used against your program.”

Similarly, analytics projects must be designed so that they can actually be put to use; a system that requires the agency to reinvent its operations will likely face an uphill climb. A recent Harvard Business Review article warned that analytics can run into the same problems that plagued big customer relationship management projects in the 1990s: “The systems remained stubbornly disconnected from how...frontline managers actually made decisions, and new demands for data management added complexity to operations.”

And finally, there is the data being analyzed, which must be right on three different levels: factually accurate, properly structured and relevant to the goals being pursued. The analytics team can handle the data structure, but the other two elements require close collaboration with program staff and subject-matter experts.