Government's use of analytic technologies and employment of data-focused digital natives are already saving billions of dollars that would otherwise be wasted, lost to fraud or spent inefficiently.
But these teams are still relatively new across government, and for many bureaucratic organizations, getting started is the hardest part. Last week, Nextgov hosted a panel with three of the federal government’s foremost analytic practitioners and asked them to share best practices for building a successful analytics shop.
I’ve distilled their feedback into four key takeaways, but first I want to share a brief tidbit about each expert to demonstrate their successes in analytics.
Kelly Tshibaka, chief data officer at the U.S. Postal Service’s Office of the Inspector General, leads an analytics operation that recently snuffed out a $1 billion fraud scheme and last year contributed another $920 million in findings.
Rebecca Shea is the audit director at the Government Accountability Office’s Forensic Audits and Investigative Service. Last year, GAO addressed 851 congressional requests and mandates and identified $63.4 billion in financial benefits through its investigations, returning $112 to the Treasury Department for each $1 invested.
Avi Bender is the director of the National Technical Information Service, a new organization within the Commerce Department that aims to help provide analytic talent and tools to agencies faster than traditional acquisition processes allow. While new, NTIS is now working with several agencies, including the Health and Human Services Department, which recently teamed with the FBI and Justice Department in a $1 billion Medicare takedown.
An Executive Sponsor
“You’re not going to go anywhere if you don’t have an executive sponsor,” Tshibaka told me.
It could be someone in the C-suite, a well-positioned program manager, or an agency head or subhead, but what really matters is whether the sponsor has the ability to move resources around.
Sometimes, justifying the creation of an analytics shop or team of data scientists is difficult, but don’t make the case with words alone.
“Pictures and stories—that picture might be worth putting a few hundred thousand dollars into an analytics enterprise,” Tshibaka said. Pictures and visualizations from other examples of analytic successes across government can be highly persuasive, especially when those examples indicate real value to taxpayers.
Lastly, executive buy-in increases with each win, large or small. Share those stories internally and externally at every chance, Shea said. Big numbers are fine, but oftentimes, the best way to share them is with visualizations.
In other words, get used to showing people pictures that illustrate the work your analytics teams are up to.
The Right People
Shea’s forensic audit team consists of “lifelong learners who enjoy the work they do.” That’s important, but at GAO, this is not dumb luck. About eight years ago, she said, the team began partnering with GAO’s Applied Research and Methods team, merging skill sets among investigators and technically gifted individuals.
“We poached a good number of people from our ARM team, got those skills in our team, and we continue to do that,” Shea said, adding that sometimes, the ARM team poaches back.
Shea used a reference from the TV show “The Big Bang Theory” to describe the cross-pollination that occurs between GAO’s various teams. Its ARM team, she said, “is a bunch of Sheldons,” meaning they’re introverts who geek out on data but don’t always communicate well. Working with other teams helps the Sheldons within GAO better translate their technical efforts.
In this regard, internal programmatic knowledge is important, and “you may not need to hire from the outside,” Shea said.
Meanwhile, the U.S. Postal Service, Tshibaka said, is hiring fewer “emotionally stunted Sheldons” and instead targeting digital natives, data scientists and business analysts with the ability to communicate across multiple domains. New tech talent is versed in programming languages like R and Python and suites like Microsoft 365, in addition to the Excel spreadsheets of yesteryear.
Bender suggested agencies seek out “catalysts for change” aligned with the agency’s mission and hell-bent on getting things done. These people are curious by nature and would rather transform something than stick to the status quo.
The Right Tools
Tools may be the easiest issue to address for federal officials creating new analytics shops because they’re so commonplace in the private sector, where business intelligence matters as much as anything. Much like shopping for cloud services in government, analytic offerings run the gamut. The best bet is to find one that suits your data, as Tshibaka succinctly explained.
“All the tools work, none are perfect; some are expensive and some are free,” she said.
Shea also stressed this point: “You don’t need the most expensive or shiniest tool: You need the one that’s most effective to your data. Let those two things guide the tools you need.”
Data is King
Data ought to be at the forefront of almost any agency mission, Bender said, but oftentimes, it’s just an afterthought.
Bender advises prospective data teams to re-read their agency mission statements and identify two or three “critical success factors” the agency must accomplish to achieve its mission. Naturally, many, if not all, are data-related.
“Most organizations are too busy at the functional and technical level, and then after that trying to align [technology] with the mission,” Bender said.
Rather than force analytics onto the mission, “instantiate a consultative methodology” whereby key leaders and technical experts identify what data matters, why and for what ends.
“The data we have in the federal government tells an amazing story about our nation’s people and economy; we need to do a better job of telling that story,” Bender said.
Tshibaka stressed that gaining access to important data sets is an initial hurdle many agencies and organizations must overcome. Yet various agencies are subject to an abundance of legal and policy restrictions here, so consult the law and leadership before demanding immediate access to every data set. When challenges arise, they’re more likely to be human than technical, and in those cases, the executive buy-in discussed above can help mitigate them.
Lastly, it’s not just access to data that matters; it’s the quality of the data accessed. Shea told me the adage “garbage in is garbage out” applies well to the government’s data sets. Data sets that aren’t complete, standardized or accurate lead to faulty conclusions or insights.