Tips for Deploying Analytics at the Federal Level

Even though the federal government is one of the biggest producers of data, if not the biggest, the public sector seems to be falling behind in implementing big data programs, according to a 2016 report published by Delft University of Technology academics.

While federal agencies have done a good job of collecting data, they’ve had less success in actually deriving insights from that data via analytics systems.

It almost seems like big data analytics is The Promised Land that’s forever over the next hill for federal agencies. However, rather than focus on the lack of an analytics capability at the federal level, or the reasons why federal agencies have fallen behind their private-sector counterparts, I’d like to offer some prescriptions for what can be done.

While growing up, I was taught the old adage that “failure to plan is planning to fail.” That’s as true of analytics as of anything else. Before even considering an analytics program, agency heads need to ensure they have buy-in from key stakeholders. Simply put, without adequate buy-in, nothing else can be accomplished. For real progress to be made, agency leaders and IT professionals need to come together to define realistic, measurable objectives for an analytics system and a clear picture of what it will cost.

Such a conversation should cover both direct costs, such as the purchase price of software and the engineering time needed to install and maintain it, and indirect costs, such as the training public-facing employees will need to use the system correctly. This is a time-consuming process that will undoubtedly delay implementation, but skipping it is a virtual guarantee of problems and disappointments down the road.

Once objectives and costs are defined, policies must be put in place to ensure relevant data is collected and stored in a way that is useful for analysis. This step is where federal agencies (and private businesses) sometimes stumble over so-called “data siloing,” where different pieces of a technology stack don’t talk to one another. The solution is good data governance.

My colleague Mark Hensley has written about the importance of data governance. Without rehashing all the details, the key thing to remember is that data needs to be entered uniformly across the organization. For example, a difference as simple as recording a citizen’s address as “Hamilton, OH” instead of “Hamilton, Ohio” can lead to duplicate records, lost information and frustrated members of the public.
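
To make the point concrete, here is a minimal sketch in Python of the kind of normalization rule that good data governance implies. The field names, lookup table and deduplication key are purely illustrative assumptions, not any agency’s actual schema.

```python
# Illustrative sketch: a shared normalization rule so full state names and
# USPS abbreviations resolve to the same value, and "Hamilton, OH" and
# "Hamilton, Ohio" produce one record key instead of two.

STATE_ABBREVIATIONS = {
    "ohio": "OH",
    "texas": "TX",
    "california": "CA",
    # ...the remaining states would live in a shared, governed lookup table
}

def normalize_state(value: str) -> str:
    """Map a state entered as a full name or two-letter code to its USPS abbreviation."""
    cleaned = value.strip().lower()
    if len(cleaned) == 2:
        return cleaned.upper()
    return STATE_ABBREVIATIONS.get(cleaned, value.strip())

def record_key(name: str, city: str, state: str) -> tuple:
    """Build the deduplication key used to detect matching citizen records."""
    return (name.strip().lower(), city.strip().lower(), normalize_state(state))

# Both spellings now collapse to the same key:
assert record_key("Jane Doe", "Hamilton", "OH") == record_key("Jane Doe", "Hamilton", "Ohio")
```

The point is not the particular lookup table but that the rule is defined once, governed centrally and applied everywhere data enters the system.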

Finally, we come to the question of privacy. Members of the public are rightfully wary of how their personal data is being used. While Facebook and other large social media companies are now discovering they’re not exempt from these questions, the concern is even more pronounced with regard to governments, which can literally hold the power of life or death over their citizens.

In practice, this means certain types of analyses, such as those involving individual health records, are likely to remain permanently off limits. However, the beauty of analytics is that it is by definition a process of aggregation. One, two or even 10 data points are unlikely to be of much help in drawing conclusions, but 100 or 1,000 are. Through the process of data masking, analysts can obscure fields containing data about specific individuals and instead receive an accurate, but anonymized, picture of the data as a whole. By doing so, federal agencies can satisfy citizen concerns while simultaneously reaching insights that allow them to better serve the public. Indeed, the majority of federal agencies have already implemented, or have plans to implement, data masking techniques.
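
As an illustration only, the sketch below shows one common masking approach: replacing direct identifiers with one-way hashed tokens before aggregating. The record fields and salt are hypothetical, and real deployments would layer on stronger anonymization techniques and access controls.

```python
import hashlib
from collections import Counter

# Illustrative sketch of field-level data masking before aggregation:
# direct identifiers are replaced with irreversible tokens, so analysts can
# count and compare records without ever seeing who they belong to.

def mask(value: str, salt: str = "per-dataset-secret") -> str:
    """Replace an identifying value with an irreversible, consistent token."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

# Hypothetical raw records; in practice these would come from an agency system.
records = [
    {"ssn": "123-45-6789", "zip": "45011", "service_used": "benefits"},
    {"ssn": "987-65-4321", "zip": "45011", "service_used": "benefits"},
    {"ssn": "555-44-3333", "zip": "45013", "service_used": "licensing"},
]

# Mask the identifier, keep the analytically useful fields.
masked = [{**r, "ssn": mask(r["ssn"])} for r in records]

# The aggregate picture is what analysts actually work with.
usage_by_zip = Counter((r["zip"], r["service_used"]) for r in masked)
print(usage_by_zip)  # e.g. Counter({('45011', 'benefits'): 2, ('45013', 'licensing'): 1})
```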

In conclusion, while the potential benefits of an analytics system for federal agencies are considerable, implementation faces several obstacles: organizational disconnect, poor data management and ethical questions. Yet these are not insurmountable, and, at the risk of joining a long line of failed prophets, I fully expect enterprising agencies to keep leveraging analytics in new and exciting ways to cut costs and improve services in the coming years. The agencies most successful in doing so are likely to be those with a clear understanding of the benefits and costs of their analytics implementation; policies in place to adequately collect, store and label relevant data; and the ability to conduct analysis responsibly, in a way that serves the public at large.

Brett Swartz is the director of public sector for Liferay.