“Build it and they will come” isn’t the way to start a service, according to data experts.
The Advisory Committee on Data for Evidence Building will meet for the first time in the coming weeks to wrestle with questions about the federal data strategy, such as whether the government can and should stand up a national secure data service.
At a Data Coalition webinar Thursday, experts said the advisory committee needs to come up with a clear outline of how a national data service would improve agency operations rather than create a warehouse that serves only as a data repository.
“What is the value proposition to the government and the people they serve?” Julia Lane, cofounder of the Coleridge Initiative, said. The Coleridge Initiative created the Administrative Data Research Facility, or ADRF, which hosts confidential data for the U.S. government and enables data sharing between agencies and states. Lane also sits on the advisory committee.
“[Sen.] Patty Murray said it really well,” she said. “We could be for more government or less government, but we should all be for better government. So enunciating what the value proposition is of combining and linking the data and protecting privacy should be the first step.”
This value proposition has to be worked out before worrying about how to create the data-sharing environment, Lane said. Only after that purpose is defined do three key questions—how to do it, how to build capacity and how to document evidence of its uses—become answerable.
The advisory committee has been in the works ever since the Foundations for Evidence-Based Policymaking Act was passed in 2018, legislation that itself grew out of the work of the Commission on Evidence-Based Policymaking. The committee’s web page will host live streams of all the meetings, but a date for the first meeting hasn’t been announced yet. Members of the committee were announced publicly via the web page this week.
Nick Hart, chief executive officer of the Data Coalition, is one of the members. He and Nancy Potok, former chief statistician for the U.S., recently released a proposal outlining what a national secure data service could look like. They recommended creating a Federally Funded Research and Development Center, or FFRDC, within the National Science Foundation.
Lane voiced support for the FFRDC idea at the webinar because of its potential both to create an entry point for outside researchers and to provide the kind of training that builds capacity for a robust national data service. She likened it to the Manhattan Project effort during World War II.
Sherry Glied, another expert who spoke at the Thursday webinar, participated in the Commission on Evidence-Based Policymaking. Glied is the former assistant secretary for planning and evaluation at the Health and Human Services Department. She said the question is how to build a useful infrastructure that leads to real action.
“The model of if you build it, they will come—it just does not work,” Glied said. “Just linking an enormous data set together is like a fever dream for academics, but it actually doesn't generate research. Research happens when people have a real question and you build the data to answer that question … a service answers a question, it doesn't just exist in the abstract.”
Building off this point, Lane added that research is not about publishing papers using this newly organized government data, but about meeting the specific needs of agencies.
“The key thing is that it's the agencies who are driving the questions,” Lane said. She added that engaging agency employees creates “viral vectors” of change, which is what will spread the use of evidence in everyday policymaking.
A successful data service is a utility that combines three approaches, according to Lane: a technical approach, meaning how to organize the data; a human approach, which is the training component; and a value-creation approach, which means the service must be oriented toward creating useful products.