Health professionals call for network to uncover best medical treatments

Advocates of a research method to uncover optimal medical treatments are urging policymakers to build a sophisticated network that would look for patterns in clinical trial studies and existing health research to identify those interventions.

If the federal government doesn't begin to build the network to support comparative effectiveness research on medical information collected by agencies, including data from the National Institutes of Health and the Centers for Medicare and Medicaid Services, the nation will fall behind in this area, said Ellen Sigal, chairwoman and founder of the nonprofit Friends of Cancer Research.

"We know this is going to be challenging, and we know there are enormous gaps," she said.

Proponents of comparative effectiveness research -- an element of President Obama's broader health care agenda -- say it can improve outcomes and cut health costs by providing doctors and patients with evidence of which medical interventions work best for certain individuals.

Sigal and other health care specialists debated on Tuesday strategies to improve comparative effectiveness research methods during a forum hosted by the Brookings Institution, a Washington think tank. The 2009 American Recovery and Reinvestment Act provided the Health and Human Services Department with $1.1 billion to compare the efficacy of treatments. Brookings convened the talk in response to concerns that the comparative approach might not improve quality and cut costs if it is developed improperly.

While there is wide disagreement about which kinds of studies produce the most reliable evidence, the common denominator among the various methods is a sophisticated information technology system, several participants said.

"We need a much more robust system. Our databases need to get better. Our electronic medical records need to get better. And these things take time," said Robert M. Califf, vice chancellor for clinical research and a cardiology professor at Duke University. "To me, the most important thing that's missing right now is a fundamental data infrastructure."

Sebastian Schneeweiss, an epidemiologist at Harvard Medical School, suggested blending longitudinal data collected by Medicare, the government-administered insurance program for people age 65 and older, with ongoing clinically detailed studies to create a data backbone.

Randy Burkholder, associate vice president of pharmaceutical industry group PhRMA, said in a separate interview with Nextgov that data mining and electronic medical records could be valuable tools for conducting comparative research, but the methods must be refined to avoid dismissing useful treatments.

"Moving from controlled clinical trials to mining data sets, you are introducing unknown variables," he said. "It's important to make sure you have valid results. . . . The infrastructure and the methods both need to be developed. I put those as equally valuable."

The public is calling for more information on how the government picks winners and losers among proposed treatments. An Internet user recently commented on an FDA blog that the agency should issue advertising agencies a request for proposals for how to better explain the agency's decision-making when it comes to drug approval.

"Current capabilities for patients to self-report adverse events are so constrained by medical, legal and political forces that they are effectively nonexistent," said the commenter on the blog, which is maintained by a task force that FDA established to become more transparent. "Approach should . . . include special attention to Clinical Trials Registry and Comparative Effectiveness study results."
