Experts at an Urban Institute event noted that the declining reliability of data sources like surveys has created a need for improved information-sharing capabilities.
Data infrastructure needs to change to better support evidence-based policies, according to panelists at an Urban Institute event on Monday.
This call for change follows requirements in the Foundations for Evidence-Based Policymaking Act of 2018—or the Evidence Act—under which federal agencies must develop evidence to support their policymaking.
“Credible statistical information supports a democratic society, because in a democratic society you’re pushing decision-making down to the lowest possible level, to the most granular level. So this informs decisions by government,” Erica Groshen, senior economics advisor at Cornell University School of Industrial and Labor Relations, said.
The current evidence-collection model relies on surveys, but Groshen emphasized the need to move beyond this model and leverage technology, big data and blending data—or combining data from multiple sources.
“We need a new 21st century data infrastructure because we face a very big threat,” Groshen said. “All survey response rates are falling, business surveys, household surveys are all falling. This raises costs of gathering that information, and it erodes the reliability of that information, raising the possibility of biases and also, given the data, larger standard errors. You have more survey volatility. So at the same time, we face an enormous opportunity. There’s the explosion of big data. We have digitized operations and records and cheap powerful computers, we have internet connectivity, we have all sorts of novel software. All of these give us a brand new and really powerful source of information that just wasn’t there before.”
The solution is to blend data from multiple sources, such as surveys, government agencies, private aggregators, companies and crowdsourcing. Then organizations can combine, match and merge that data and model it using predictive analytics.
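The panelists named no specific tooling, but the combine-match-merge step they describe is essentially record linkage on a shared identifier. A minimal, purely illustrative sketch in Python (the field names and records are hypothetical, not from any real agency dataset):

```python
# Hypothetical sketch: blend survey responses with administrative
# records by matching on a shared identifier (an inner join).
# All names and values below are illustrative assumptions.

survey = [
    {"id": 1, "employed": True,  "hours": 40},
    {"id": 2, "employed": False, "hours": 0},
]
admin_records = [
    {"id": 1, "ui_claim": False},
    {"id": 2, "ui_claim": True},
]

def blend(left, right, key="id"):
    """Merge two record lists on a shared key, keeping only matches."""
    index = {row[key]: row for row in right}
    return [{**row, **index[row[key]]} for row in left if row[key] in index]

blended = blend(survey, admin_records)
# Each blended record now carries both survey and administrative
# fields, ready for downstream modeling.
```

In practice this matching step is where the legal and technical barriers discussed below arise, since each source is governed by its own statute.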
The panelists noted that collaboration, sharing data and blended data are important in this new era, because information may overlap, be related or be more beneficial when combined to better serve the public.
According to Irma Perez-Johnson, an independent consultant and social policy researcher, there should be a collaborative network of scholars, policymakers and service providers working together to “efficiently overcome roadblocks to the systemic collection, maintenance and analysis of data to support the efficacy and cost effectiveness of our social programs and policies.”
However, government also needs to take a systemic perspective on the ways that agencies use data.
“Many of our statutes and regulations sort of prevent the kind of linking and use of information from one particular program, let’s say unemployment insurance, to support the evaluation of related services, like for example, employment and training programs in a different sector,” Perez-Johnson said. “This is what I mean by bringing this systemic perspective. And as we look at the reauthorization of particular laws or pieces of legislation, or the update of regulations, we need to bring this more systemic perspective to thinking about the process by which we generate, use and then support the analysis and safeguarding of data to support these public purposes.”
Claire Bowen, principal research associate and statistical methods group lead at Urban Institute, noted how laws that are meant to protect privacy can reduce the functionality of data by reducing its shareability.
“One of the biggest barriers for us to link these datasets of people saying like, ‘I want more demographic data, I want to link that with education’ is all these laws—like there is FERPA, HIPAA, Title 13, Title 26, and so forth,” Bowen said. “All these different laws govern how we should protect different kinds of datasets, and the biggest barrier is they don’t talk to each other, or agencies have to renegotiate every fiscal year to merge some of their datasets. Or some of the things is like, ‘yeah, we can share this part of the data, but you have to destroy it at the end of the year.’ So then that becomes an issue of like, ‘well, what if we’re trying to see trends over time?’ So that becomes a huge limitation.”
According to Bowen, the public must decide what information they are comfortable sharing and understand that complete privacy and full data usability cannot be achieved at the same time. She added that privacy and equity must be defined in each circumstance to navigate concerns.
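The tradeoff Bowen describes is the core idea behind formal privacy techniques such as differential privacy, where noise added to protect individuals also degrades accuracy. A minimal sketch of a Laplace-noise mechanism, with illustrative parameters that are an assumption on our part rather than any agency's actual method:

```python
# Illustrative sketch of the privacy/usability tradeoff via a
# Laplace mechanism. Epsilon values here are arbitrary examples.
import math
import random

def noisy_count(true_count, epsilon, sensitivity=1.0):
    """Release a count with Laplace noise: a smaller epsilon means
    stronger privacy but a noisier, less usable answer."""
    scale = sensitivity / epsilon
    # Inverse-CDF sampling of Laplace(0, scale) noise.
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise

random.seed(0)
strict = noisy_count(100, epsilon=0.1)   # strong privacy, noisy answer
loose = noisy_count(100, epsilon=10.0)   # weak privacy, near-exact answer
```

Dialing epsilon down protects individuals in the data but widens the error on every published statistic, which is exactly the usability cost the panel discussed.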
The panelists also emphasized the need to focus on outcomes and to make people comfortable sharing their data and information. They noted the importance of data-sharing agreements and of returning value to users through analysis.
“The most fundamental thing is just to continue to build incentives into the system for people to be able to appreciate evidence and use evidence more broadly,” Brian Scholl, principal economic advisor and senior economist of the Office of the Investor Advocate at the Securities and Exchange Commission, said. “I think that oftentimes happens at the leadership level within agencies, so introducing a robust culture of evidence within the agencies, but also transforming the culture broadly.”
Scholl noted that to serve the public good, data must be used to solve complex problems, not only low-risk ones.
“If you don’t solve problems in society, then people lose trust and faith in governments,” Scholl said. “So I’d like to see a little bit more attention to creating parameters by which researchers internally can use research and data more easily. So, for example, we are able to collect non-sensitive data very, very easily, but it's sometimes a challenge for us to actually make use of that data in a way that is most beneficial to the public, which would be publicly using that data. So that kind of challenge is just an adherence to scientific integrity principles, so research doesn’t become a political football. Independent research has to be critical to what we do.”