How to Go from Clinical Research to Practice in 4 Months

George Steptoe
JANUARY 11, 2019

The path from research to practice is notoriously sluggish. That's why researchers set out to develop a new rapid evidence review process to help health system leaders more quickly and efficiently identify and prioritize novel tech-based tools for clinical care. The results, published in a recent study, could prove useful to doctors who want to take advantage of the proliferating technologies that connect patients with clinical teams to improve care.

The research team looked at tools that can support patient self-management (SM), patient decision aids, point-of-care clinical decision support and web-based platforms that keep healthcare teams and patients in touch outside of clinic visits, such as tools that collect patient-reported outcomes (PROs).

>> READ: Our Top 8 Health-Tech Research Stories of 2018

Health system leaders are increasingly enthusiastic that PRO and SM tools can make care safer, more effective and more patient-centered. But the tools can also be difficult to implement, owing to their complexity, the difficulty of figuring out how best to put them to use in local health systems, and the varying degrees to which they've been tested. What's more, formal evaluation of new technologies before implementation can take years.

When speaking with clinical experts, clinicians, nurses and pharmacists, the researchers encountered a great deal of excitement about the promise of PRO and SM tools, according to Tanner Caverly, M.D., MPH, lead author of the study and assistant professor in the departments of learning health sciences and internal medicine at the University of Michigan Medical School.

“But they are seeing lots and lots of work to do before being able to actually put those into practice and carry them out,” Caverly said.

How to speed up the process while maintaining a rigorous evidence-based grounding? Enter the rapid evidence review process.

The researchers set out a five-step methodology: an "environmental scan," in which symptom domains were identified by reviewing the guidelines published by major professional health organizations; expert panel recruitment; an evidence review panel; an analysis stage; and a local validation panel.

The process is meant to streamline the information provided to an expert panel, reducing the time needed to evaluate and prioritize options from an array of interventions. Panels met just twice before convening for a five-hour in-person meeting, employing a modified Delphi process, a method designed to obtain quick expert consensus.
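The study does not publish its exact scoring rules, but the basic shape of a Delphi-style consensus round can be sketched as follows. Note that the 1-9 rating scale, the tertile cutoffs and the consensus rule here are illustrative assumptions in the spirit of common panel methods, not the study's actual criteria:

```python
from statistics import median

def delphi_round(ratings):
    """Summarize one Delphi round of 1-9 panelist ratings for a candidate tool.

    Illustrative consensus rule (an assumption, not the study's method):
    a tool is a priority if the median rating falls in the top tertile
    (7-9) and no panelist rated it in the bottom tertile (1-3).
    """
    med = median(ratings)
    # "Disagreement" if ratings span both the bottom and top tertiles
    disagreement = min(ratings) <= 3 and max(ratings) >= 7
    priority = med >= 7 and not disagreement
    return {"median": med, "disagreement": disagreement, "priority": priority}

# Five hypothetical panelists rate two candidate tools
print(delphi_round([8, 7, 9, 8, 7]))  # clear consensus: priority
print(delphi_round([9, 8, 2, 8, 7]))  # one strong dissent blocks consensus
```

In a full Delphi process, tools that miss consensus in one round would be re-rated after discussion, which is what compresses deliberation into a small number of structured meetings.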

All in all, the study found that by employing a rapid evidence review process, the time from research to practice could be reduced from several years to just 4 months.

>> LISTEN: Implementing AI Is Simpler than You Think

Caverly said the impetus for the study was rooted in his team's interest in the patient-centered aspect of the pay-for-performance Oncology Care Model (OCM), through which clinicians at hundreds of cancer care centers across the U.S. routinely send surveys to patients to measure patient experience and incentivize improvements.

For a patient, the treatment that follows a cancer diagnosis, such as powerful chemotherapy regimens and surgeries, can be challenging enough, to say nothing of existential angst, cancer-related fatigue and severe depression, according to Caverly.

“And it turns out that in current medical practice, we’re not always as good at paying much attention to... what the patient is experiencing internally,” Caverly said. “We may not know how severe it is.”

“By tracking any side effects that occur and using these technologies [PROs and SMs] to allow patients to communicate more effectively with the clinical teams, they can act on that and manage that more aggressively. It can radically improve the patient’s experience.”

The findings, though, came with some reservations.

According to the study, “panelists had substantial concerns about implementing patient-reported outcome tracking tools, voicing concerns about liability, lack of familiarity with new technology, and additional time and workflow changes such tools would require.”

The tech-based tools can in some ways be too effective at generating patient feedback, Caverly said.

“Clinicians and nurses have a concern about data deluge — they feel that they don’t have a lot of time to take on a large extra task,” Caverly said. “And if all of a sudden they had a whole new stream of data coming in from patients, they don’t know how they’re going to be able to fit that into their work day.”

Instead, clinicians favored technologies that did not require clinician involvement, the report concluded.
