HHS’ investments in data science, people and tools are paying off in a big way.
Dr. Charles Gott had been practicing medicine for more than 30 years at the time of his arrest.
Trained as a cardiologist, Gott began treating patients with chronic pain at his Bowling Green, Kentucky, clinic in the mid-2000s, and within a few years he was prescribing more methadone than any other doctor in central Kentucky. He wasn’t stingy with fentanyl, hydrocodone, oxymorphone or other opioids, either.
Between 2006 and 2013, 41 people died within 30 days of filling a prescription from Gott. Nine of those deaths were drug-related. In a pre-trial memo, Assistant U.S. Attorney Joseph Ansari said the doctor “ignored numerous red flags that indicated patients were drug seekers or addicts.”
On Feb. 5, Gott pleaded guilty to 46 federal charges, including health care fraud, unlawful distribution of a controlled substance and conspiring to distribute controlled substances that were not for a legitimate medical purpose. He recently completed the first month of an eight-year sentence in federal prison.
And if it weren’t for government data, he’d likely still be prescribing medications.
Gott is one of many unscrupulous health care providers put behind bars thanks to the data analytics experts in the Health and Human Services Department’s Office of Inspector General.
The office has been bringing lawbreakers to justice for years, but it’s recently transformed the way it uses data to uncover fraud, waste and abuse in Medicare and Medicaid by adopting sophisticated analytics tools that tap into troves of information stored across the agency.
“What data does is it creates a pyramid effect, and we can go to the top of that pyramid,” said Mike Cohen, an operations officer with the OIG’s Office of Investigations. “It helps us to stratify the way we do our cases. We’re going after the worst of the worst.”
Data analysis has always factored into investigations in some respects, but the scale of those efforts ramped up dramatically in recent years, and HHS plans to use the office’s success to set the bar for open data efforts across the entire federal government.
The Hub of the Wheel
In a conversation with Nextgov, Cohen said investigators in the past had to request data from federal contractors, and those claims could take months to process. Even after officials started learning to perform their own basic analyses, most of their cases came from “random poking around” federal datasets or tips from the office hotline, he said.
Those cases often uncovered information that led to future investigations, but there was no way to see how the findings fit into the bigger picture.
“There was certainly plenty of work to do,” said Cohen, “but we were not stratifying those cases to get the biggest hits. Over the years we may work a couple hundred-thousand-dollar cases, but not work an $18 million case because we didn’t know about it.”
But the days of following breadcrumbs are over.
Today, investigators in Washington and regional offices constantly analyze the Centers for Medicare and Medicaid Services’ Integrated Data Repository, which contains petabytes of data on patient claims, providers, risk scores and other topics, and build their own metrics to spot potential wrongdoers.
Among the data points OIG finds particularly useful for spotting opioid fraudsters are morphine equivalent doses, which measure the total prescription strength of opioids for individual patients, and the prevalence of “doctor shoppers,” or patients who receive prescriptions from multiple physicians. If they find a provider with abnormally high dosing levels or numerous doctor shoppers, like Gott, that kicks off the field work that could ultimately lead to criminal charges, Cohen said.
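The screening Cohen describes can be sketched in a few lines of Python. This is an illustrative toy only: the prescription records, the 90 MME daily-dose cutoff, and the doctor-shopper rule below are hypothetical assumptions for the example, not OIG's actual data or criteria.

```python
# Toy screen for the two signals the article names: abnormally high
# morphine-equivalent doses (MME) and "doctor shopper" patients who get
# prescriptions from multiple physicians. All records and cutoffs are
# hypothetical, chosen only to make the example self-contained.
from collections import defaultdict

# (provider_id, patient_id, daily_mme) per prescription -- made-up data
prescriptions = [
    ("dr_a", "p1", 40), ("dr_a", "p2", 35),
    ("dr_b", "p3", 220), ("dr_b", "p4", 300), ("dr_b", "p1", 250),
]

MME_CUTOFF = 90        # assumed threshold for an abnormally high daily dose
SHOPPER_PRESCRIBERS = 2  # assumed: a patient seeing >= 2 providers "shops"
SHOPPER_SCRIPTS = 2      # assumed: >= 2 shopper scripts flags a provider

def flag_providers(rows):
    doses = defaultdict(list)       # provider -> daily MME values written
    prescribers = defaultdict(set)  # patient  -> distinct providers seen
    for provider, patient, mme in rows:
        doses[provider].append(mme)
        prescribers[patient].add(provider)
    shoppers = {p for p, docs in prescribers.items()
                if len(docs) >= SHOPPER_PRESCRIBERS}
    flagged = {}
    for provider, vals in doses.items():
        avg = sum(vals) / len(vals)
        n_shopper_scripts = sum(1 for prov, pat, _ in rows
                                if prov == provider and pat in shoppers)
        if avg > MME_CUTOFF or n_shopper_scripts >= SHOPPER_SCRIPTS:
            flagged[provider] = {"avg_mme": avg,
                                 "shopper_scripts": n_shopper_scripts}
    return flagged
```

In this toy dataset, `dr_b` is flagged for an average daily dose far above the cutoff, while `dr_a`'s single shopper script falls below the flagging rule; a flag would only start the kind of field work the article describes, not establish wrongdoing.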
Other OIG departments use the CMS repository for their own audits and research, crunching financial data, building risk models or examining regional and demographic drug use. Those analyses are also funneled to investigators to inform their work, like “spokes on a wheel,” said Cohen. “We’re the hub.”
His office also frequently partners with federal, state and local law enforcement agencies to share data and help them flesh out their own data shops.
Still, there’s no magic metric or algorithm that identifies fraudsters. Data itself can only get you so far, and even after crunching the numbers from every possible angle, the agency still needs boots-on-the-ground investigative work to get to the bottom of cases.
“There’s a lot that goes into determining if something is irrational [or] whether it’s actually fraudulent,” said Tim Kropp, a senior adviser to OIG’s chief data officer. “The data and the analytics are always in service of a person here doing their work. It’s important to think that way…to never forget that it is that investigator, that auditor, that evaluator that’s the heart of what you’re doing.”
A Look Under the Hood
Babe Ruth couldn’t have hit a home run with a plastic bat, and even the best data gurus likely couldn’t find groundbreaking insights in data without the right tools.
Kropp told Nextgov the office is investing heavily in new technologies that allow investigators to work more efficiently and accurately, and storing all those tools in a single cloud-based analytics platform. Dubbed the “Data Basecamp,” the platform provides analysts with an ever-expanding toolkit for finding the bad apples in the barrel.
The basecamp includes a variety of mapping software, open source tools, advanced statistical programs and software as a service applications to help investigators contextualize the data. Housing the entire suite of applications in a single platform gives offices access to numerous capabilities and lets them pick and choose the tools they need without having to recreate the whole stack for each new investigation, Kropp said.
The platform also includes a number of visualization tools, which Kropp noted play a significant role in helping less tech-savvy decision makers understand analysts’ work.
“For someone to take action on the information you created…[they’ve] got to understand it at the end of the day and they’ve got to have trust in it,” he said. “We are visual human beings. Somebody could talk all day about the statistics of something, but if you’re able to pair that [finding] with a visualization somebody can understand, then you’ve made it really powerful.”
As investigators rely more heavily on analytics for their work, their improved results are starting to chip away at the cultural issues that can slow the adoption of emerging technologies, said Claire Walsh, principal consultant for data and analytics solutions at Excella Consulting. Excella is one of the vendors helping OIG stand up and expand the data basecamp, and Walsh has played a key role in shepherding the office’s data efforts from scattered clusters of Excel spreadsheets to a unified platform used by more than 1,600 employees.
One significant benefit of the platform is it’s allowed investigators to retool previous analyses for different purposes, Walsh told Nextgov. Her company and OIG are currently discussing ways to make data and analyses more searchable, perhaps by building an “Amazon for OIG data” application that would let employees peruse datasets and scripts, and drop interesting items in a virtual cart.
“One of the pain points they have is someone in [one region] might be working on an audit that’s very similar to someone in [another region] and there’s no visibility and knowledge transfer there,” she told Nextgov. “The idea is with this modern toolkit we give analysts a secure, individual workspace in the cloud where they can store files. There’s now visibility—they don’t have to create everything from scratch every time.”
Kropp said the office is also working to build more predictive tools into the system that would run a series of basic analyses on incoming data and automatically flag outliers for potential investigations related to opioids and a slew of other issues.
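One minimal version of the automatic outlier flagging Kropp describes is a statistical screen over incoming metrics. The sketch below uses a simple z-score rule on hypothetical monthly claim counts; the data, the metric, and the cutoff are illustrative assumptions, not OIG's actual method.

```python
# Hedged sketch of auto-flagging outliers in incoming data: compute each
# provider's deviation from the group mean and flag anyone far above it.
# Data and the z-score cutoff are illustrative assumptions only; a real
# system would use larger samples and more robust statistics.
from statistics import mean, stdev

def flag_outliers(metrics, z_cut):
    """Return names whose value sits more than z_cut std devs above the mean."""
    values = list(metrics.values())
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [name for name, v in metrics.items() if (v - mu) / sigma > z_cut]

# Hypothetical monthly claim counts; one provider bills far above the rest.
claims_per_month = {"prov1": 100, "prov2": 110, "prov3": 95,
                    "prov4": 105, "prov5": 900}
```

With this tiny sample, a loose cutoff such as `z_cut=1.5` isolates the extreme biller; with realistic sample sizes a stricter threshold (and outlier-resistant statistics like the median absolute deviation) would be more appropriate.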
“Opioids is a major focus for us now because of the impact it’s having on our country,” he said, “but we have a very large oversight mission and there’s a lot of areas we’re putting data into to see how we can use our resources to the best effect.”
Data to the People
OIG’s redoubled emphasis on analytics comes as part of a broader initiative to make HHS a more data-driven organization and make the agency’s mountains of information more publicly accessible.
Over the past several years, HHS opened nearly 2,000 datasets in the hope that industry and researchers would use the information to create solutions not yet conceived by government, said HHS Chief Data Officer Mona Siddiqui. Lawmakers have also called on the agency to aggregate data on opioids and addiction in a public dashboard to help local organizations fight the growing substance abuse epidemic.
Siddiqui told Nextgov that varying regulations and compliance measures at different agency subcomponents have created data silos that limit groups’ access to information. Since assuming her role last year, she’s made it her mission to lower such barriers to entry and create an environment where organizations can more readily get the data that can improve and inform their decisions.
“It turns out that we are great at taking data and doing reporting from that data and using data for the primary purpose for which it’s collected, but we’re not using the data for secondary purposes,” she said. Her team recently completed an assessment of each component’s data-sharing policies, and it’s now working to build an agencywide governance structure to make those exchanges, in her words, “more seamless, accountable and transparent.”
In December, HHS hosted a code-a-thon that brought hundreds of programmers, entrepreneurs and public health advocates together to create tech solutions to the opioid epidemic. The agency gave participants access to some 70 in-house datasets for building their tools.
Siddiqui said the event proved to be a great opportunity to show HHS leaders the value data could create when put in the right hands, which could ultimately push them to invest more in the agency’s own analytics efforts.
“We want to be able to highlight the fact that people want to do this but aren’t resourced appropriately to be able to carry out the kind of work they want to,” she said.
It will take time to work out the nuances of data governance, and the final plan would likely let groups keep some ownership over their data, but just starting the conversation around information sharing and collaboration is a big first step, said Siddiqui.
“This is really a long-term strategy for the department,” she said. “Transforming a large organization into one that uses data in how it operates and leverages data as an asset is not really a six-month or a two-year journey. It’s really about a 15- to 20-year commitment.”
“Having a data-driven and outcome-oriented approach is really where we would want any organization to be, and I think the federal government should absolutely lead the way, and I see HHS as being a leader in that.”