Rapid, responsive, relevant (R3) research: a call for a rapid learning health research enterprise
© Riley et al.; licensee Springer. 2013
Received: 2 April 2013
Accepted: 30 April 2013
Published: 10 May 2013
Our current health research enterprise is painstakingly slow and cumbersome, and its results seldom translate into practice. The slow pace of health research contributes to findings that are less relevant and potentially even obsolete. To produce more rapid, responsive, and relevant research, we propose approaches that increase relevance via greater stakeholder involvement, speed research via innovative designs, streamline review processes, and create and/or better leverage research infrastructure. Broad stakeholder input integrated throughout the research process can both increase relevance and facilitate study procedures. More flexible and rapid research designs should be considered before defaulting to the traditional two-arm randomized controlled trial (RCT), but even traditional RCTs can be designed for more rapid findings. Review processes for grant applications, IRB protocols, and manuscript submissions can be better streamlined to minimize delays. Research infrastructures such as rapid learning systems and other health information technologies can be leveraged to rapidly evaluate new and existing treatments, and alleviate the extensive recruitment delays common in traditional research. These and other approaches are feasible but require a culture shift among the research community to value not only methodological rigor, but also the pace and relevance of research.
This protracted period from concept to publication is further exacerbated by the slow and limited uptake of research findings into practice. Balas and Boren have estimated that, for the 14% of evidence that ever progresses to implementation, the path from concept to implementation takes approximately 17 years. A recent report from the President’s Council of Advisors on Science and Technology (PCAST) estimated that 3,000 treatments are in development and concluded that major upgrades are needed in the research system to evaluate them. An Institute of Medicine report on clinical trials states that “recognition is growing that the clinical trials enterprise in the United States faces substantial challenges impeding the efficient and effective conduct of clinical research to support the development of new medicines and evaluate existing therapies.” Clearly, our current research enterprise is too slow, inefficient, and cumbersome to meet rapidly evolving demands.
Issues in promoting rapid research by stage of research (concept through trial preparation; recruitment through follow-up; analysis through publication)

Relevance and stakeholder engagement:
• Engage stakeholders via evaluability assessment to assist with design of practical trials
• Consider outcomes and measures important and relevant to stakeholders who will need to act on results
• Establish stakeholder “citizen-scientist” feedback panels; leverage networking technologies
• Maintain ongoing engagement with stakeholders on methods to improve recruitment and follow-up retention
• Submit preliminary findings to stakeholders for review and direction-setting
• Submit initial results to stakeholders for assistance with interpretation, relevance, dissemination, and forming next study questions
• Share presentations with stakeholders at policy and practice venues

Rapid research designs:
• Replace the traditional pilot with iterative N-of-1 and optimization designs
• Consider within-subject designs and MINC rather than typical comparison conditions
• Leverage technology to automate RCTs when possible
• Consider alternatives to the two-arm RCT, including factorial, within-subject, pragmatic, quasi-experimental, and rapid learning designs
• Report proximal outcomes while follow-up data collection continues

Rapid review processes:
• Streamline the grant review process
• Encourage reviewers to consider innovative designs that speed research
• Streamline the IRB approval process, especially for low-risk studies
• Provide rapid modification approvals from IRBs
• Encourage online and open access publication
• Provide incentives to speed manuscript reviews

Infrastructure for rapid research:
• Use data standards and common data elements to improve research efficiency and facilitate data sharing
• Create rapid learning systems that can generate data to test multiple competing hypotheses and develop predictive models
• Create national biobank/bio-sample systems
• Use practice network registries to speed recruitment and provide enriched histories and follow-up
• Leverage existing EHR and other rapid learning data systems to rapidly test hypotheses
• Establish robust policies and procedures for data sharing and merging
• Improve systems for disseminating findings to appropriate stakeholders
Relevance and stakeholder engagement
Broad stakeholder engagement involving patients, providers, health plans, policy makers, and other relevant stakeholders may seem counterintuitive as a strategy to speed research, but this time investment has the potential to improve the recruitment and retention of study participants, thus increasing the pace of conducting the study. More importantly, stakeholder engagement increases the likelihood that findings will be relevant to stakeholders and more readily adopted into practice, thereby making the overall research pipeline more efficient. These “evaluability assessments” or participatory approaches are considered key to facilitating the adoption of research findings by practitioners. The Patient-Centered Outcomes Research Institute (PCORI), for example, is creating public advisory groups and soliciting patient input on specific comparative effectiveness questions that are relevant to practitioners and stakeholders.
An ongoing relationship among researchers, healthcare providers, health plans, and patients is critical to a better, faster research system. Clinical trial recruitment is a major problem, with about 90% of US trials failing to meet enrollment goals. The NIH Health Care System Collaboratory (HCSC) offers an important resource for rapid research. Like the HMO Research Network and the VA QUERI program, the HCSC will offer opportunities to conduct large-scale studies within well-organized healthcare delivery systems. Research embedded in organized delivery systems and networks not only enhances research relevance but also facilitates recruitment, retention, study start-up, operations, data capture, and integration into practice.
An accelerated research system can also use information technologies to speed the process of seeking and obtaining stakeholder feedback. Consistent with a citizen-scientist model, a virtual network of stakeholders can participate and provide feedback throughout the research process using online surveys, virtual meetings, and social media. Via innovative technologies, stakeholder feedback can be obtained efficiently to increase research relevance and responsiveness, even for researchers who are not fully integrated into the practice setting or community where their research findings are likely to be translated into practice.
Rapid research designs
Traditional study designs and procedures are well established and rigorous, but notoriously slow and costly. This belabored research process typically begins with pilot trials that we posit have limited benefit, are used inappropriately to estimate effect sizes, and often prematurely concretize a less-than-optimal intervention. Instead, we recommend replacing the traditional pilot trial with a more flexible, iterative intervention testing and optimization approach, analogous to the agile software development process that places a premium on failing early to succeed later. For example, N-of-1 trial designs provide intervention development flexibility. With the increasing availability of intensive longitudinal data from wireless sensors and mobile devices, N-of-1 trials can be rapidly implemented and provide results congruent with a more personalized medicine approach, and Bayesian analyses of a series of such trials may provide sufficient evidence of generalizability to limit the need for a larger trial.
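As an illustrative sketch only (no data or effect sizes here come from the studies cited; everything is simulated), a series of alternating-treatment N-of-1 trials can each yield a within-person effect estimate, which can then be pooled across individuals. Precision weighting is used below as a simpler stand-in for the Bayesian combination approaches referenced above:

```python
import random
import statistics

random.seed(42)

def n_of_1_trial(true_effect, n_blocks=6, noise=1.0):
    """Simulate one alternating-treatment (A/B) N-of-1 trial.

    Each block contributes one treatment-minus-control outcome
    difference; the within-person effect is the mean difference,
    with a standard error from the block-to-block variability.
    """
    diffs = [true_effect + random.gauss(0, noise) for _ in range(n_blocks)]
    return statistics.mean(diffs), statistics.stdev(diffs) / n_blocks ** 0.5

# A series of N-of-1 trials across individuals with heterogeneous
# (hypothetical) true effects centered at 2.0
person_effects = [random.gauss(2.0, 0.5) for _ in range(8)]
results = [n_of_1_trial(e) for e in person_effects]

# Precision-weighted pooling across the series: trials with smaller
# standard errors contribute more to the combined estimate
weights = [1 / se ** 2 for _, se in results]
pooled = sum(w * est for w, (est, _) in zip(weights, results)) / sum(weights)
print(f"pooled effect estimate: {pooled:.2f}")
```

If the pooled estimate and its spread across persons are consistent, this kind of series may, as the text suggests, reduce the need for a subsequent large parallel-group trial.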
Intervention optimization designs such as fractional factorial and sequential multiple assignment randomized trials (SMART) are particularly valuable when intervention development questions involve combinations or sequences of intervention components. Dynamic system models have also been used to optimize treatments. Some optimization approaches may take more time than the traditional pilot trial, but the pace of the overall research enterprise will be improved by more quickly discarding or modifying interventions that are unlikely to be found effective in larger and more expensive trials.
Within the traditional RCT, researchers have a number of design decisions that can increase efficiency. Trials that utilize within-group designs, in which participants serve as their own controls, can speed the research process by reducing the number of participants needed to detect outcomes, and can often simplify study procedures as well. The Minimal Intervention Needed for Change (MINC) standard provides a pragmatic comparison anchor across studies for comparative effectiveness research. The VA is adopting a “point of care” randomization approach that computer-randomizes patients to different treatments and then uses adaptive algorithms to change the allocation of new patients as evidence accumulates. Recent technological advances make it possible to conduct “automated RCTs” in which enrollment, random assignment, intervention delivery, and outcome assessment are fully automated. To fully realize the potential of automated RCTs and other rapid learning systems, the nature of and procedures for informed consent need to be resolved.
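The adaptive-allocation idea can be sketched with Thompson sampling, one generic Bayesian allocation algorithm (not necessarily the one the VA uses). The arm names echo the point-of-care insulin trial cited above, but the response rates are invented for illustration:

```python
import random

random.seed(7)

# Hypothetical response rates for two arms; unknown to the algorithm
TRUE_RATES = {"sliding_scale": 0.40, "weight_based": 0.65}

# Beta(1, 1) priors, stored as [successes + 1, failures + 1] per arm
posterior = {arm: [1, 1] for arm in TRUE_RATES}
allocations = {arm: 0 for arm in TRUE_RATES}

for patient in range(500):
    # Thompson sampling: draw one value from each arm's posterior and
    # assign the patient to the arm with the highest draw, so allocation
    # drifts toward the better arm as evidence accumulates
    draws = {arm: random.betavariate(a, b) for arm, (a, b) in posterior.items()}
    arm = max(draws, key=draws.get)
    allocations[arm] += 1
    outcome = random.random() < TRUE_RATES[arm]  # observe patient response
    posterior[arm][0 if outcome else 1] += 1     # update the posterior

print(allocations)
```

After a few hundred patients, most new enrollees are routed to the better-performing arm, which is the efficiency gain the point-of-care design is after, while early patients still generate comparative evidence.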
Follow-up periods can also be shortened or segmented. Results can be analyzed at the point where the maximal benefit of the intervention is hypothesized to occur. Longer-term outcomes can be modeled from these results, or one investigator can remain blinded, conduct the follow-up portion of the study, and publish the follow-up results separately.
In addition to improving the efficiency of RCTs, we also need to consider alternative designs that may be more appropriate to the research question and provide more rapid and relevant answers. A range of within-subject and quasi-experimental designs, such as interrupted time series, stepped wedge, and regression discontinuity, may have less internal validity than the RCT but offer a number of advantages. For example, these quasi-experimental approaches facilitated the participation of the major Minnesota health insurers in the DIAMOND depression treatment program. These designs may be particularly appropriate for evaluating treatments already adopted in practice.
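The core logic of an interrupted time series can be shown in a stylized sketch: fit the pre-intervention trend, project it forward as the counterfactual, and compare observed post-intervention values against that projection. The data below are invented and noise-free; real analyses must also handle autocorrelation, seasonality, and trend changes:

```python
import statistics

# Hypothetical monthly outcome series (e.g., % of patients receiving
# guideline-concordant care); the program starts at month 12 and is
# constructed here to add an 8-point level shift
pre = [40 + 0.5 * t for t in range(12)]           # pre-intervention trend
post = [40 + 0.5 * t + 8 for t in range(12, 24)]  # post-intervention values

# Fit the pre-period linear trend (simple least squares, closed form)
t_pre = list(range(12))
t_bar, y_bar = statistics.mean(t_pre), statistics.mean(pre)
slope = (sum((t - t_bar) * (y - y_bar) for t, y in zip(t_pre, pre))
         / sum((t - t_bar) ** 2 for t in t_pre))
intercept = y_bar - slope * t_bar

# Project the counterfactual into the post-period and estimate the
# level change as the mean observed-minus-projected difference
counterfactual = [intercept + slope * t for t in range(12, 24)]
effect = statistics.mean(p - c for p, c in zip(post, counterfactual))
print(f"estimated level change: {effect:.1f}")  # 8.0 for this noise-free series
```

Because the design uses each site or population as its own control over time, it can be run on routinely collected data already in hand, which is much of its speed advantage over a prospective RCT.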
Rapid review processes
It takes 9 to 11 months from NIH grant submission to funding. If revised and resubmitted, and assuming a six-month revision period, it can take two years from initial submission to the award of a revised (A1) NIH grant application. During this time, science and technology continue to advance; research partnerships, especially with non-research stakeholders, must be maintained; and the research questions may become less relevant or timely.
Grant review and funding processes could be streamlined in a number of ways. For the Recovery Act Challenge Grants, a flexible, two-stage review process reduced the time from receipt to funding to five months. Rapid review processes are already used by the NIH for time-sensitive natural experiments [29, 30]. In response to the SARS (severe acute respiratory syndrome) outbreak, the Canadian Institutes of Health Research issued a funding announcement that drew 18 submissions within 2 weeks; these were reviewed, and four were approved for funding, within 10 days of submission. These rapid review examples clearly indicate that it is possible to review and fund research applications quickly when necessary, and such rapid review systems should be considered for a broader range of research, including timely and pressing clinical and public health questions.
The grant application review process could facilitate more rapid research not only by reviewing more efficiently, but also by placing a greater premium on rapid and innovative research designs. Despite the addition of innovation as an NIH review criterion, a recent study of grant applications found that novelty was associated with a 4.5 percentile point drop in evaluation scores, and that feasibility concerns did not contribute substantially to this “novelty penalty”. Innovative designs, especially those that speed the pace of research relative to traditional designs, should be rewarded, not penalized.
Institutional Review Boards (IRBs) should also consider streamlining review procedures. Slowness of research should itself be considered a risk, both to study participants, who may continue in their assigned treatment even as newer treatments become available, and to the broader public, for whom answers to relevant research questions are delayed. Revisions to the Common Rule are anticipated to allow for a more flexible and rapid review process.
Online and open access publication practices have greatly reduced the time from acceptance to publication, but this could be further improved by a better or more incentivized process for recruiting reviewers and obtaining reviews. As Green noted, new technologies for publication, systematic reviews, and dissemination of evidence-based guidelines reduce the time from research findings to practitioner adoption, but the publication and dissemination process should continue to be reviewed to further reduce the time lapses between the various stages of the dissemination and implementation process.
Infrastructure for rapid research
Improving our research infrastructure has the potential not only to speed the pace of research but also to increase its rigor and relevance. The health system has lagged decades behind other sectors in information technology implementation. As a result, health research has been severely constrained by a data-poor environment in which acquiring needed research data is expensive, difficult, and time-consuming. Since the rapid-learning health system and learning healthcare system concepts were advanced in 2007, major investments have been made in databases and learning networks to take advantage of the research potential of electronic health records. It is now possible to conduct some studies in weeks or months instead of years. The FDA Mini-Sentinel system accesses 125 million patient records to generate several studies per week on drug safety questions. Large biobanks are now coming online at Kaiser Permanente, the Veterans Health Administration, the ENCODE network, and the UK Biobank. Using these and other patient databases, researchers have been able to assess the unintended effects of treatments and produce outcome findings comparable to RCTs.
Large future investments are now being considered that could offer extraordinary opportunities for researchers and a faster, more efficient infrastructure for rapid learning research. The NIH Director has proposed a new national patient-oriented research system with electronic health record databases, including genomics, for 20–30 million patients, and PCORI recently released a funding announcement to support development of the National Patient-Centered Clinical Research Network. The Big Data to Knowledge (BD2K) initiative also provides the opportunity to leverage these large data sets for rapid research.
Researchers can accelerate the collective pace of learning with greater attention to reporting comparable data. There have been a number of efforts to encourage the use of common data elements. Standardization is particularly problematic for patient-reported outcomes. To address this problem, there have been consensus measurement efforts such as PhenX, as well as efforts to co-calibrate various patient-reported outcome measures on a single metric.
The National Research Council report on Precision Medicine calls for a new science commons and national learning system that will revolutionize biomedical research, clinical care, and public health. One benefit of such a system is that researchers can more readily target drug approval studies to predicted high-response populations and cut years from the drug research process. Gleevec, an anti-cancer drug, was approved on the basis of a trial of only 54 patients because nearly all showed benefit. With targeted therapeutics and research, it took only four years from target discovery to drug approval for the lung cancer drug Xalkori. Rapid learning systems covering large patient populations appear to provide the infrastructure needed to rapidly evaluate treatments.
We have outlined a number of actions to enhance relevance, streamline design, speed review, and use new research infrastructures to make research more rapid, relevant, and responsive to the 21st century demands on health research. This transformation to a rapid research learning system will require a concerted effort by research funders, academic institutions, healthcare systems, researchers, and a variety of practice, community, and policy stakeholders to produce a culture change among the health research community. Will we continue to use limited funds to support the currently slow and cumbersome research enterprise that produces costly results that may not be relevant or easily translated into practice, or are we willing to pursue alternative approaches that in other research disciplines have produced rapid and relevant improvements?
The rationale and opportunities for such a culture change have never been greater or rapid answers more needed. Our recommendations to speed research are undoubtedly incomplete, and we invite the research community to contribute additional recommendations to increase the speed and relevance of the research enterprise. This call to streamline and speed the research process is also likely to be met with skepticism, especially among those who fear that methodological rigor might be compromised in a quest for greater efficiency. We believe that the recommendations outlined in this paper can be achieved without compromising scientific rigor, and any efforts to streamline research should be judged based on methodological soundness. We are convinced, however, that the currently dominant health research paradigm is too slow and inefficient to address today’s challenges, and that we must produce a more rapid, responsive, and relevant research enterprise.
- Ioannidis JP: Effect of the statistical significance of results on time to completion and publication of randomized efficacy trials. JAMA 1998, 279: 281–286. doi:10.1001/jama.279.4.281
- Nilsen W, Kumar S, Shar A, Varoquiers C, Wiley T, Riley WT, Pavel M, Atienza AA: Advancing the science of mHealth. J Health Commun 2012, 17: 5–10.
- Chimowitz MI, Lynn MJ, Derdeyn CP, Turan TN, Fiorella D, for the SAMMPRIS Trial Investigators: Stenting versus aggressive medical therapy for intracranial arterial stenosis. N Engl J Med 2011, 365: 993–1003.
- Qureshi AI: Interpretations and implications of the prematurely terminated Stenting and Aggressive Medical Management for Preventing Recurrent Stroke in the Intracranial Stenosis (SAMMPRIS) trial. Neurosurgery 2012, 70: E264–E268. doi:10.1227/NEU.0b013e318239f318
- Balas EA, Boren SA: Managing clinical knowledge for health care improvement. Yearbook of Medical Informatics. Stuttgart: Schattauer; 2000.
- President’s Council of Advisors on Science and Technology (PCAST) Report to the President on Propelling Innovation In Drug Discovery, Development, and Evaluation. http://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-fda-final.pdf
- Weisfeld N, English RA, Claiborne AB: Envisioning a Transformed Clinical Trials Enterprise in the United States. Washington, DC: Institute of Medicine, National Academies Press; 2012.
- Leviton LC, Khan LK, Rog D, Dawkins N, Cotton D: Evaluability assessment to improve public health policies, programs, and practices. Annu Rev Public Health 2010, 31: 213–233. doi:10.1146/annurev.publhealth.012809.103625
- Green LW: Making research relevant: if it is an evidence-based practice, where’s the practice-based evidence? Fam Pract 2008, 25: i20–i24. doi:10.1093/fampra/cmn055
- Patient-Centered Outcomes Research Institute. http://www.pcori.org/
- NIH HCS Collaboratory. http://commonfund.nih.gov/hcscollaboratory/overview.aspx
- Silvertown J: A new dawn for citizen science. Trends Ecol Evol 2009, 24: 467–471. doi:10.1016/j.tree.2009.03.017
- Yao JT, Liu WN: Web-based dynamic Delphi: a new survey instrument. http://www2.cs.uregina.ca/~jtyao/Papers/Delphi_Cam.pdf
- Leon AC, Davis LL, Kraemer HC: The role and interpretation of pilot studies in clinical research. J Psychiatr Res 2011, 45: 626–629. doi:10.1016/j.jpsychires.2010.10.008
- Gary K, Enquobahrie A, Ibanez L, Cheng P, Yaniv Z: Agile methods for open source safety-critical software. Softw Pract Exp 2011, 41: 945–962. doi:10.1002/spe.1075
- Lillie EO, Patay B, Diamant J, Issell B, Topol EJ, Schork NJ: The N-of-1 clinical trial: the ultimate strategy for individualizing medicine? Per Med 2011, 8: 161–173. doi:10.2217/pme.11.7
- Zucker DR, Ruthazer R, Schmid CH: Individual (N-of-1) trials can be combined to give population comparative treatment effect estimates: methodologic considerations. J Clin Epidemiol 2010, 63: 1312–1323. doi:10.1016/j.jclinepi.2010.04.020
- Collins LM, Murphy SA, Strecher V: The multiphase optimization strategy (MOST) and the sequential multiple assignment randomized trial (SMART): new methods for more potent e-health interventions. Am J Prev Med 2007, 32: S112–S118. doi:10.1016/j.amepre.2007.01.022
- Rivera DE, Pew MD, Collins LM: Using engineering control principles to inform the design of adaptive interventions: a conceptual introduction. Drug Alcohol Depend 2007, 88: S31–S40.
- Gierisch JM, DeFrank JT, Bowling JM, Rimer BK, Matuszewski JM: Finding the minimal intervention needed for sustained mammography adherence. Am J Prev Med 2010, 39: 334–344. doi:10.1016/j.amepre.2010.05.020
- Fiore LD, Brophy M, Ferguson RE, D’Avolio L, Hermos JA: A point-of-care clinical trial comparing insulin administered using a sliding scale versus a weight-based regimen. Clin Trials 2011, 8: 183–195. doi:10.1177/1740774511398368
- Linden A, Adams JL: Applying a propensity score-based weighting model to interrupted time series data: improving causal inference in programme evaluation. J Eval Clin Pract 2011, 17: 1231–1238. doi:10.1111/j.1365-2753.2010.01504.x
- Brown CA, Lilford RJ: The stepped wedge trial design: a systematic review. BMC Med Res Methodol 2006, 6: 54–62. doi:10.1186/1471-2288-6-54
- Shadish WR, Galindo R, Wong VC, Steiner PM, Cook TD: A randomized experiment comparing random and cutoff-based assignment. Psychol Methods 2011, 16: 179–191.
- Glasgow RE, Magid DJ, Beck A, Ritzwoller D, Estabrooks PA: Practical clinical trials for translating research to practice: design and measurement recommendations. Med Care 2005, 43: 551–557. doi:10.1097/01.mlr.0000163645.41407.09
- Solberg LI, Glasgow RE, Unutzer J, Jaeckels N, Oftedahl G: Partnership research: a practical trial design for evaluation of a natural experiment to improve depression care. Med Care 2010, 48: 576–582. doi:10.1097/MLR.0b013e3181dbea62
- National Institutes of Health Office of Extramural Research, Grants Review Process. http://grants.nih.gov/grants/grants_process.htm
- Recovery Act Limited Competition: NIH Challenge Grants in Health and Science Research (RC1). http://grants.nih.gov/grants/guide/rfa-files/RFA-OD-09-003.html
- Time-Sensitive Obesity Policy and Program Evaluation (R01). http://grants.nih.gov/grants/guide/pa-files/PAR-12-257.html
- Rapid Assessment Post-Impact of Disaster (R21). http://grants.nih.gov/grants/guide/pa-files/PAR-12-181.html
- Singh B: Innovation and challenges in funding rapid research responses to emerging infectious diseases: lessons learned from the outbreak of severe acute respiratory syndrome. Can J Infect Dis Med Microbiol 2004, 15: 167–170.
- Boudreau KJ, Guinan EC, Lakhani KR, Riedl C: The novelty paradox and bias for normal science: evidence from randomized medical grant proposal evaluations. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2184791
- Human Subjects Research Protections: Enhancing Protections for Research Subjects and Reducing Burden, Delay, and Ambiguity for Investigators. http://www.gpo.gov/fdsys/pkg/FR-2011-07-26/html/2011-18792.htm
- Gupta KC: What does the marriage of Open Access and online publication bring? AIDS Res Ther 2004, 14: 1.
- Bender MW, Mitwalli AH, Van Kuiken SJ: What’s holding back online medical data. McKinsey Quarterly. http://www.mckinseyquarterly.com
- Etheredge LM: A rapid-learning health system. Health Aff 2007, 26: w107–w118. doi:10.1377/hlthaff.26.2.w107
- Moores K, Gilchrist B, Carnahan R, Abrams T: A systematic review of validated methods for identifying pancreatitis using administrative data. Pharmacoepidemiol Drug Saf 2012, 21(Suppl 1): 194–202.
- Kaiser Permanente Research Program on Genes, Environment, and Health (RPGEH). http://www.dor.kaiser.org/external/DORExternal/rpgeh/index.aspx
- VA Specimen Research and Biobanking Program. http://www.research.va.gov/programs/specimen_biobanking.cfm
- The ENCODE Project: ENCyclopedia Of DNA Elements. http://www.genome.gov/10005107
- UK Biobank. http://www.ukbiobank.ac.uk/about-biobank-uk/
- Hippisley-Cox J, Coupland C: Unintended effects of statins in men and women in England and Wales: population-based cohort study using the QResearch database. BMJ 2010, 340: c2197. doi:10.1136/bmj.c2197
- Tannen RL, Weiner MG, Xie D: Use of primary care electronic medical record database in drug efficacy research on cardiovascular outcomes: comparison of database and randomized controlled trial findings. BMJ 2009, 338: b81. doi:10.1136/bmj.b81
- Collins F: A Vision for a National Patient-Centered Research Network. http://www.pcori.org/events/national-workshop-to-advance-use-of-electronic-data/?type=past
- NIH Undertakes “Big Data” Initiatives. https://www.aamc.org/advocacy/washhigh/highlights2012/323668/121412nihundertakesbigdatainitiatives.html
- NIH Common Data Element Research Portal. http://www.nlm.nih.gov/cde/index.html
- Hamilton CM, Strader LC, Pratt JG, Maiese D, Hendershot T: The PhenX Toolkit: get the most from your measures. Am J Epidemiol 2011, 174: 253–260. doi:10.1093/aje/kwr193
- Noonan VK, Cook KF, Bamer AM, Choi SW, Kim J, Amtmann D: Measuring fatigue in persons with multiple sclerosis: creating a crosswalk between the Modified Fatigue Impact Scale and the PROMIS Fatigue Short Form. Qual Life Res 2012, 21: 1123–1133. doi:10.1007/s11136-011-0040-3
- National Research Council: Toward Precision Medicine: Building a New Knowledge Network for Biomedical Research and a New Taxonomy of Disease. Washington, DC: National Academies Press; 2011.
- Sollano JA: Delivering Value Through Personalized Medicine: An Industry Perspective. http://healthforum.brandeis.edu/meetings/conference-pages/2012-24-july.html