Published in: Clinical and Translational Medicine 1/2013

Open Access 01.12.2013 | Perspective

Rapid, responsive, relevant (R3) research: a call for a rapid learning health research enterprise

Authors: William T Riley, Russell E Glasgow, Lynn Etheredge, Amy P Abernethy


Abstract

Our current health research enterprise is painstakingly slow and cumbersome, and its results seldom translate into practice. The slow pace of health research contributes to findings that are less relevant and potentially even obsolete. To produce more rapid, responsive, and relevant research, we propose approaches that increase relevance via greater stakeholder involvement, speed research via innovative designs, streamline review processes, and create and/or better leverage research infrastructure. Broad stakeholder input integrated throughout the research process can both increase relevance and facilitate study procedures. More flexible and rapid research designs should be considered before defaulting to the traditional two-arm randomized controlled trial (RCT), but even traditional RCTs can be designed for more rapid findings. Review processes for grant applications, IRB protocols, and manuscript submissions can be better streamlined to minimize delays. Research infrastructures such as rapid learning systems and other health information technologies can be leveraged to rapidly evaluate new and existing treatments, and alleviate the extensive recruitment delays common in traditional research. These and other approaches are feasible but require a culture shift among the research community to value not only methodological rigor, but also the pace and relevance of research.
Notes

Electronic supplementary material

The online version of this article (doi:10.1186/2001-1326-2-10) contains supplementary material, which is available to authorized users.

Competing interests

The authors declare that they have no competing interests. Preparation of this manuscript was supported in part by a grant from The Robert Wood Johnson Foundation to George Washington University to the third author. The funders had no role in this manuscript. The opinions expressed are those of the authors and do not necessarily reflect those of the Robert Wood Johnson Foundation or the National Cancer Institute.

Authors’ contributions

WTR led the writing and preparation of this paper. REG, LE, and APA all contributed sections to this paper, and revised and edited drafts of the paper. All authors read and approved the final manuscript.

Background

Despite increasing demands to produce timely and relevant research findings, our traditional research process remains painstakingly slow. Randomized efficacy trials take approximately 5.5 years from the initiation of enrollment to publication [1], and 7 years or longer once the time from grant application submission to enrollment initiation is added. Extensive follow-up periods for relevant outcomes such as morbidity/mortality, as well as delays in participant recruitment and publication, can extend this time period to a decade or longer. During this period, scientific and technological advances will occur that may make the eventual findings less relevant or even obsolete. For illustration, Figure 1 shows a few of the salient consumer technologies introduced during a typical seven-year clinical trial. These recent advances in consumer technologies are most impactful for mobile and wireless health research [2], but many less mainstream scientific and medical technological advances also occur while clinical trials are being conducted. For example, one explanation for the recently reported negative results of the SAMMPRIS trial was that stent technologies and surgical procedures had advanced substantially since study initiation [3, 4].
This protracted period from concept to publication is further exacerbated by the slow and limited uptake of research findings into practice. Balas and Boren have estimated that it takes approximately 17 years from concept to evidence implementation for the 14% of evidence that progresses to implementation [5]. A recent report from the President’s Council of Advisors on Science and Technology (PCAST) estimated that 3,000 treatments are in development and concluded that major upgrades are needed in the research system to evaluate these treatments [6]. An Institute of Medicine report on clinical trials states that “recognition is growing that the clinical trials enterprise in the United States faces substantial challenges impeding the efficient and effective conduct of clinical research to support the development of new medicines and evaluate existing therapies [7].” Clearly, our current research enterprise is too slow, inefficient, and cumbersome to meet the rapidly evolving demand.
What is needed are “rapid-learning research systems” that integrate researchers, funders, health systems, practitioners, and community partners; ask clinically relevant questions; use efficient and innovative research designs; and leverage rich, longitudinal data sets from millions of patients. To begin progress toward such a system, we describe approaches to make research more rapid, responsive, and relevant (R3), organized into four sections for the purpose of this paper: 1) stakeholder engagement, 2) design, 3) review, and 4) infrastructure (see Table 1). We offer these approaches as a starting point for a dialogue among the health research community: a challenge to our current cumbersome research enterprise and an invitation to consider these or other approaches that maintain scientific rigor while speeding the production of more responsive and relevant research findings.
Table 1
Issues in promoting rapid research by stage of research
 
Concept through trial preparation
Recruitment through follow-up
Analysis through publication
Stakeholder Relevance
• Engage stakeholders via evaluability assessment to assist with design of practical trials
• Consider outcomes and measures important and relevant to stakeholders who will need to act on results
• Establish stakeholder “citizen-scientist” feedback panels; leverage networking technologies.
• Ongoing engagement with stakeholders on methods to improve recruitment and follow-up retention
• Submit preliminary findings to stakeholders for review and direction-setting
• Submit initial results to stakeholders for assistance with interpretation, relevance, dissemination and forming next study questions
• Share presentations with stakeholders at policy and practice venues
Design Issues
• Replace the traditional pilot with iterative N-of-1 and optimization designs
• Consider within-subject designs and MINC as alternatives to typical comparison conditions
• Leverage technology to automate RCTs when possible
• Consider alternatives to the two-arm RCT including factorial, within subject, pragmatic, quasi-experimental, and rapid learning designs
• Report proximal outcomes while follow-up data collection continues
Review Issues
• Streamlined grant review process
• Encourage reviewers to consider innovative designs that speed research
• Streamline IRB approval process, especially for low risk studies
• Rapid modification approvals from IRBs
• Encourage online and open access publication
• Incentives to speed manuscript reviews
Infrastructure Issues
• Use of data standards and common data elements to improve research efficiency and facilitate data sharing
• Create rapid learning systems that can generate data to test multiple competing hypotheses and develop predictive models
• Create national biobank/bio-samples systems
• Use practice network registries to speed recruitment and provide enriched histories and follow-up data
• Leverage existing EHR and other rapid learning data systems to rapidly test hypotheses
• Robust policies and procedures for data sharing and merging
• Improved systems for disseminating findings to appropriate stakeholders

Relevance and stakeholder engagement

Broad stakeholder engagement involving patients, providers, health plans, policy makers and other relevant stakeholders may seem counterintuitive as a strategy to speed research, but this time investment has the potential to improve the recruitment and retention of study participants, thus increasing the pace of conducting the study. More importantly, stakeholder engagement increases the likelihood that findings will be relevant to stakeholders and more readily adopted into practice, thereby making the overall research pipeline more efficient. These “evaluability assessments” [8] or participatory approaches are considered key to facilitating the adoption of research findings by practitioners [9]. The Patient Centered Outcomes Research Institute (PCORI), for example, is creating public advisory groups and soliciting patient input on specific comparative effectiveness questions that are relevant to practitioners and stakeholders [10].
An ongoing relationship between researchers, healthcare providers, health plans, and patients is critical to a better, faster research system. Clinical trial recruitment is a major problem, with about 90% of US trials failing to meet enrollment goals [7]. The NIH Health Care System Collaboratory (HCSC) offers an important resource for rapid research. Like the HMO Research Network and the VA QUERI program, the HCSC will make available opportunities to conduct large-scale studies within well-organized healthcare delivery systems [11]. Research embedded in organized delivery systems and networks not only enhances research relevance, but also facilitates recruitment, retention, study start-up, operations, data capture, and integration into practice.
An accelerated research system can also use information technologies to speed the process of seeking and obtaining stakeholder feedback. Consistent with a citizen-scientist model [12], a virtual network of various stakeholders can participate and provide feedback throughout the research process using online surveys, virtual meetings, and social media systems [13]. Via innovative technologies, stakeholder feedback can be obtained efficiently to increase research relevance and responsiveness, even for researchers who are not fully integrated into the practice setting or community where their research findings are likely to be translated into practice.

Rapid research designs

Traditional study designs and procedures are well-established, rigorous, and notoriously slow and costly. This belabored research process typically begins with pilot trials that we posit have limited benefit, are used inappropriately to estimate effect size [14], and often prematurely concretize a less than optimal intervention. Instead, we recommend replacing the traditional pilot trial with a more flexible iterative intervention testing and optimization approach, analogous to the agile software development process that places a premium on failing early to succeed later [15]. For example, N-of-1 trial designs provide intervention development flexibility. With the increasing availability of intensive longitudinal data from wireless sensors and mobile devices, N-of-1 trials can be rapidly implemented and provide results congruent with a more personalized medicine approach [16], and Bayesian analyses from a series of such trials [17] may provide sufficient evidence of generalizability to limit the need for a larger trial.
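To illustrate how a series of N-of-1 trials might be combined, the sketch below simulates several ABAB-style crossover trials on intensive longitudinal data and pools the per-patient effects with a simple normal-normal Bayesian model. The phase lengths, effect sizes, noise levels, and pooling model are all illustrative assumptions, not the method of the cited work.

```python
import random
import statistics

random.seed(42)

def simulate_n_of_1(true_effect, phases=4, obs_per_phase=7, noise=1.0):
    """Simulate one ABAB N-of-1 trial; return the estimated
    within-person treatment effect (treatment minus baseline)."""
    treat, base = [], []
    for phase in range(phases):
        on_treatment = phase % 2 == 1          # alternate B (treatment) phases
        for _ in range(obs_per_phase):         # e.g. daily sensor readings
            y = (true_effect if on_treatment else 0.0) + random.gauss(0, noise)
            (treat if on_treatment else base).append(y)
    return statistics.mean(treat) - statistics.mean(base)

# A series of N-of-1 trials, each estimating the effect for one patient
# (hypothetical patient-level effects drawn around 0.8)
effects = [simulate_n_of_1(true_effect=random.gauss(0.8, 0.3)) for _ in range(10)]

# Normal-normal Bayesian pooling under a vague prior (mean 0, variance 100):
# posterior precision = prior precision + n / sampling variance
prior_mean, prior_var = 0.0, 100.0
samp_var = statistics.variance(effects)
n = len(effects)
post_var = 1.0 / (1.0 / prior_var + n / samp_var)
post_mean = post_var * (prior_mean / prior_var + sum(effects) / samp_var)
print(f"pooled effect estimate: {post_mean:.2f} (posterior sd {post_var ** 0.5:.2f})")
```

A pooled posterior of this kind is what would let a series of individual trials speak to generalizability before committing to a larger trial; a real analysis would model between-patient heterogeneity explicitly rather than assume a common sampling variance.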
Intervention optimization designs such as fractional factorial and sequential multiple assignment research trials (SMART) are particularly valuable when the intervention development questions involve combinations or sequences of intervention components [18]. Dynamic system models have also been used to optimize treatments [19]. Some optimization approaches may take more time than the traditional pilot trial, but the pace of the overall research enterprise will be improved by more quickly discarding or modifying interventions that are unlikely to be found effective in larger and more expensive trials.
Within the traditional RCT, researchers have a number of design options that can increase efficiency. Trials that use within-group designs, in which participants serve as their own controls, can speed the research process by reducing the number of study participants needed to detect outcomes, and can often simplify study procedures as well. The Minimal Intervention Needed for Change (MINC) standard [20] provides a standard pragmatic comparison anchor across studies for comparative effectiveness research. The VA is adopting “point of care” randomization, which computer-randomizes patients to different treatments and then uses adaptive algorithms to change the allocation of new patients as evidence accumulates [21]. Recent technological advances make it possible to conduct “automated RCTs” in which enrollment, random assignment, intervention delivery, and outcome assessment are fully automated. To fully realize the potential of automated RCTs and other rapid learning systems, the nature of and procedures for informed consent need to be resolved.
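The adaptive allocation idea behind point-of-care randomization can be sketched with Thompson sampling over two arms with binary outcomes. The response rates and patient counts below are hypothetical; a real system would be embedded in the EHR with appropriate consent and safety monitoring, and the VA's actual algorithm may differ.

```python
import random

random.seed(7)

# Hypothetical response probabilities for two treatments (unknown to the system)
TRUE_RESPONSE = {"A": 0.55, "B": 0.70}

# Beta(1, 1) priors per arm: "s" counts successes + 1, "f" counts failures + 1
posterior = {arm: {"s": 1, "f": 1} for arm in TRUE_RESPONSE}

allocations = {"A": 0, "B": 0}
for patient in range(500):
    # Thompson sampling: draw a plausible response rate from each arm's
    # posterior and assign the new patient to the arm with the higher draw
    draws = {arm: random.betavariate(p["s"], p["f"]) for arm, p in posterior.items()}
    arm = max(draws, key=draws.get)
    allocations[arm] += 1

    # Observe the (simulated) outcome and update that arm's posterior
    responded = random.random() < TRUE_RESPONSE[arm]
    posterior[arm]["s" if responded else "f"] += 1

print(allocations)  # allocation typically drifts toward the better-performing arm
```

The design choice here is the appeal of adaptive allocation: as evidence accumulates, fewer new patients are assigned to the apparently inferior arm, which is both an ethical and an efficiency gain over fixed 1:1 randomization.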
Follow-up periods also can be shortened or segmented. Results can be analyzed at the point where the maximal benefit of the intervention is hypothesized to occur. Longer-term outcomes can be modeled from these results, or one of the investigators can remain blind to conduct the follow-up portion of the study and publish the follow-up results separately.
In addition to improving the efficiency of RCTs, we also need to consider alternative designs that may be more appropriate to the research question and provide more rapid and relevant answers. A range of within-subject and quasi-experimental designs such as interrupted time series [22], stepped wedge [23], and regression discontinuity [24] may have less internal validity than the RCT, but offer a number of advantages [25]. For example, these quasi-experimental approaches facilitated participation of the major Minnesota health insurers in the DIAMOND depression treatment program [26]. These designs may be particularly appropriate for evaluating treatments already adopted in practice.
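One of the quasi-experimental designs mentioned above, the interrupted time series, can be illustrated with a minimal sketch: fit the pre-intervention trend, project it forward as the counterfactual, and read the intervention effect off the observed-minus-projected gap. The data are simulated with a built-in level shift of -4.0; a real analysis would also address autocorrelation, seasonality, and covariates (e.g. via segmented regression [22]).

```python
import random
import statistics

random.seed(0)

# Simulated monthly outcome: a pre-intervention trend, then a level shift
# after the program launches at month 24 (all numbers hypothetical)
pre  = [50 + 0.2 * t + random.gauss(0, 1.5) for t in range(24)]
post = [50 + 0.2 * t - 4.0 + random.gauss(0, 1.5) for t in range(24, 48)]

def linear_fit(xs, ys):
    """Closed-form simple least squares: returns (intercept, slope)."""
    xbar, ybar = statistics.mean(xs), statistics.mean(ys)
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return ybar - slope * xbar, slope

# Fit the pre-intervention trend, project it as the counterfactual,
# and estimate the effect as the mean observed-minus-projected gap
intercept, slope = linear_fit(range(24), pre)
counterfactual = [intercept + slope * t for t in range(24, 48)]
effect = statistics.mean(y - c for y, c in zip(post, counterfactual))
print(f"estimated level change: {effect:.1f}")  # should be near the built-in -4.0
```

Because the series acts as its own control, this design needs no randomized comparison group, which is why it suits treatments already adopted in practice.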

Rapid review processes

It takes 9 to 11 months from NIH grant submission to funding [27]. If an application is revised and resubmitted, and assuming a six-month revision period, it can take two years from initial submission to the award of a revised (A1) NIH grant application. During this time, science and technology continue to advance; research partnerships, especially with non-research stakeholders, must be maintained; and the research questions may become less relevant or timely.
Grant review and funding processes could be streamlined in a number of ways. For the Recovery Act Challenge Grants [28], a flexible, two-stage review process reduced the time from receipt to funding to five months. The NIH already uses rapid review processes for time-sensitive natural experiments [29, 30]. In response to the SARS (Severe Acute Respiratory Syndrome) outbreak, the Canadian Institutes of Health Research issued a funding announcement that drew 18 submissions within 2 weeks; these submissions were reviewed, and four approved for funding, within 10 days of submission [31]. These examples clearly indicate that research applications can be reviewed and funded quickly when necessary, and that rapid review systems should be considered for a broader range of research, including timely and pressing clinical and public health questions.
The grant application review process could facilitate more rapid research not only by reviewing more efficiently, but also by placing a greater premium on more rapid and innovative research designs. Despite the addition of innovation as an NIH review criterion, a recent study of grant applications found that novelty was associated with a 4.5 percentile point drop in scores, and that feasibility concerns did not contribute substantially to this “novelty penalty” [32]. Innovative designs, especially those that speed the pace of research relative to traditional designs, should be rewarded, not penalized.
Institutional Review Boards (IRBs) also should consider streamlining review procedures. Slowness of research should be considered a risk, both to study participants who may continue in their assigned treatment even as newer treatments become available, and to the broader public who are delayed getting answers to relevant research questions. Revisions to the Common Rule are anticipated to allow for a more flexible and rapid review process [33].
Online and open access publication practices have greatly reduced the time from acceptance to publication [34], but the process could be further accelerated by better incentives for recruiting reviewers and obtaining timely reviews. As Green noted, new technologies for publication, systematic reviews, and dissemination of evidence-based guidelines reduce the time from research findings to practitioner adoption, but the publication and dissemination process should be continually reviewed to further reduce the time lapses between its stages [9].

Infrastructure for rapid research

Improving our research infrastructure has the potential not only to speed the pace of research, but also to increase its rigor and relevance. The health system has lagged decades behind other sectors in IT implementation [35]. As a result, health research has been severely constrained by a data-poor environment in which acquiring needed research data is expensive, difficult, and time-consuming. Since the rapid-learning health system and learning healthcare system concepts were advanced in 2007 [36], major investments have been made in databases and learning networks to take advantage of the research potential of electronic health records. It is now possible to conduct some studies in weeks or months instead of years. The FDA mini-Sentinel system accesses 125 million patient records to generate several studies per week on drug safety questions [37]. Large biobanks are now coming on-line at Kaiser-Permanente [38], the Veterans Health Administration [39], the ENCODE network [40], and the UK Biobank [41]. Using these and other patient databases, researchers have been able to assess the unintended effects of treatments [42] and produce outcome findings comparable to RCTs [43].
Large future investments are now being considered that could offer extraordinary opportunities for researchers and a faster, more efficient infrastructure for rapid learning research. The NIH Director has proposed a new national patient-oriented research system with electronic health records databases, including genomics, for 20–30 million patients [44], and PCORI recently released a funding announcement to support development of the National Patient Centered Clinical Research Network [10]. The Big Data to Knowledge (BD2K) initiative [45] also provides the opportunity to leverage these large data sets for rapid research.
Researchers can accelerate the collective pace of learning with greater attention to reporting comparable data. There have been a number of efforts to encourage the use of common data elements [46]. Standardized outcome data are particularly problematic for patient-reported outcomes. To address this problem, there have been consensus measurement efforts such as PhenX [47] as well as efforts to co-calibrate various patient-reported outcome measures on a single metric [48].
The National Research Council report on Precision Medicine calls for a new science commons and national learning system that will revolutionize biomedical research, clinical care, and public health [49]. One of the benefits of such a system is that researchers can more readily target drug approval studies to predicted high-response populations, and cut years from the drug research process. Gleevec, an anti-cancer drug, was approved in a trial of only 54 patients because nearly all showed benefit [6]. With targeted therapeutics and research, it took only four years from target discovery to drug approval for the lung cancer drug Xalkori [50]. Rapid learning systems of large patient populations appear to provide the infrastructure to rapidly evaluate treatments.

Conclusions

We have outlined a number of actions to enhance relevance, streamline design, speed review, and use new research infrastructures to make research more rapid, relevant, and responsive to the 21st century demands on health research. This transformation to a rapid research learning system will require a concerted effort by research funders, academic institutions, healthcare systems, researchers, and a variety of practice, community, and policy stakeholders to produce a culture change among the health research community. Will we continue to use limited funds to support the currently slow and cumbersome research enterprise that produces costly results that may not be relevant or easily translated into practice, or are we willing to pursue alternative approaches that in other research disciplines have produced rapid and relevant improvements?
The rationale and opportunities for such a culture change have never been greater or rapid answers more needed. Our recommendations to speed research are undoubtedly incomplete, and we invite the research community to contribute additional recommendations to increase the speed and relevance of the research enterprise. This call to streamline and speed the research process is also likely to be met with skepticism, especially among those who fear that methodological rigor might be compromised in a quest for greater efficiency. We believe that the recommendations outlined in this paper can be achieved without compromising scientific rigor, and any efforts to streamline research should be judged based on methodological soundness. We are convinced, however, that the currently dominant health research paradigm is too slow and inefficient to address today’s challenges, and that we must produce a more rapid, responsive, and relevant research enterprise.
Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Literatur
1.
Zurück zum Zitat Ioannidis JP: Effect of the statistical significance of results on time to completion and publication of randomized efficacy trials. JAMA 1998, 279: 281–286. 10.1001/jama.279.4.281CrossRefPubMed Ioannidis JP: Effect of the statistical significance of results on time to completion and publication of randomized efficacy trials. JAMA 1998, 279: 281–286. 10.1001/jama.279.4.281CrossRefPubMed
2.
Zurück zum Zitat Nilsen W, Kumar S, Shar A, Varoquiers C, Wiley T, Riley WT, Pavel M, Atienza AA: Advancing the science of mHealth. J Health Communication 2012, 17: 5–10.CrossRefPubMed Nilsen W, Kumar S, Shar A, Varoquiers C, Wiley T, Riley WT, Pavel M, Atienza AA: Advancing the science of mHealth. J Health Communication 2012, 17: 5–10.CrossRefPubMed
3.
Zurück zum Zitat Chimowitz MI, Lynn MJ, Derderyn CP, Turan TN, Fiorella D, for the SAMMPRIS Trial Investigators: Stenting versus aggressive medical therapy for intracranial arterial stenosis. NEJM 2011, 265: 993–1003.CrossRef Chimowitz MI, Lynn MJ, Derderyn CP, Turan TN, Fiorella D, for the SAMMPRIS Trial Investigators: Stenting versus aggressive medical therapy for intracranial arterial stenosis. NEJM 2011, 265: 993–1003.CrossRef
4.
Zurück zum Zitat Qureshi AI: Interpretations and implications of the prematurely terminated stenting and aggressive medical management for preventing recurrent stroke in the intracranial stenosis (SAMMPRIS) trial. Neurosurg 2012, 70: E264-E268. 10.1227/NEU.0b013e318239f318CrossRef Qureshi AI: Interpretations and implications of the prematurely terminated stenting and aggressive medical management for preventing recurrent stroke in the intracranial stenosis (SAMMPRIS) trial. Neurosurg 2012, 70: E264-E268. 10.1227/NEU.0b013e318239f318CrossRef
5.
Zurück zum Zitat Balas EA, Boren SA: Managing clinical knowledge for health care improvement: Yearbook of medical informatics. Stuttgart: Schattauer; 2000. Balas EA, Boren SA: Managing clinical knowledge for health care improvement: Yearbook of medical informatics. Stuttgart: Schattauer; 2000.
7.
Zurück zum Zitat Weisfeld N, English RA, Claiborne AB: Envisioning a Transformed Clinical Trials Enterprise in the United States. Washington, DC: Institute of Medicine, National Academies Press; 2012. Weisfeld N, English RA, Claiborne AB: Envisioning a Transformed Clinical Trials Enterprise in the United States. Washington, DC: Institute of Medicine, National Academies Press; 2012.
8.
Zurück zum Zitat Leviton LC, Khan LK, Rog D, Dawkins N, Cotton D: Evaluability assessment to improve public health policies, programs, and practices. Ann Rev Public Health 2010, 31: 213–233. 10.1146/annurev.publhealth.012809.103625CrossRef Leviton LC, Khan LK, Rog D, Dawkins N, Cotton D: Evaluability assessment to improve public health policies, programs, and practices. Ann Rev Public Health 2010, 31: 213–233. 10.1146/annurev.publhealth.012809.103625CrossRef
9.
Zurück zum Zitat Green LW: Making research relevant: if it is an evidence-based practice, where’s the practice-based evidence? Fam Pract 2008, 25: i20-i24. 10.1093/fampra/cmn055CrossRefPubMed Green LW: Making research relevant: if it is an evidence-based practice, where’s the practice-based evidence? Fam Pract 2008, 25: i20-i24. 10.1093/fampra/cmn055CrossRefPubMed
12.
Zurück zum Zitat Silverton J: A new dawn for citizen science. Trends Ecol Evol 2009, 24: 467–471. 10.1016/j.tree.2009.03.017CrossRef Silverton J: A new dawn for citizen science. Trends Ecol Evol 2009, 24: 467–471. 10.1016/j.tree.2009.03.017CrossRef
14.
Zurück zum Zitat Leon AC, Davis LL, Kraemer HC: The role and interpretation of pilot studies in clinical research. J Psychiatr Res 2011, 45: 626–629. 10.1016/j.jpsychires.2010.10.008PubMedCentralCrossRefPubMed Leon AC, Davis LL, Kraemer HC: The role and interpretation of pilot studies in clinical research. J Psychiatr Res 2011, 45: 626–629. 10.1016/j.jpsychires.2010.10.008PubMedCentralCrossRefPubMed
15.
Zurück zum Zitat Gary K, Enquobahrie A, Ibanez L, Cheng P, Yaniv Z: Agile methods for open source safety-critical software. Softw Pract Exp 2011, 41: 945–962. 10.1002/spe.1075PubMedCentralCrossRefPubMed Gary K, Enquobahrie A, Ibanez L, Cheng P, Yaniv Z: Agile methods for open source safety-critical software. Softw Pract Exp 2011, 41: 945–962. 10.1002/spe.1075PubMedCentralCrossRefPubMed
16.
Zurück zum Zitat Lillie EO, Patay B, Diamant J, Issell B, Topol EJ, Schork NJ: The N-of-1 clinical trial: the ultimate strategy for individualizing medicine? Per Med 2011, 8: 161–173. 10.2217/pme.11.7PubMedCentralCrossRefPubMed Lillie EO, Patay B, Diamant J, Issell B, Topol EJ, Schork NJ: The N-of-1 clinical trial: the ultimate strategy for individualizing medicine? Per Med 2011, 8: 161–173. 10.2217/pme.11.7PubMedCentralCrossRefPubMed
17.
Zurück zum Zitat Zucker DR, Ruthazer R, Schmid CH: Individual (N-of-1) trials can be combined to give population comparative treatment effect estimates: methodologic considerations. J Clin Epidemiology 2010, 63: 1312–1323. 10.1016/j.jclinepi.2010.04.020CrossRef Zucker DR, Ruthazer R, Schmid CH: Individual (N-of-1) trials can be combined to give population comparative treatment effect estimates: methodologic considerations. J Clin Epidemiology 2010, 63: 1312–1323. 10.1016/j.jclinepi.2010.04.020CrossRef
18.
Zurück zum Zitat Collins LM, Murphy SA, Stretcher V: The multiphase optimization strategy (MOST) and the sequential multiple assignment randomized trial (SMART): new methods for more potent e-health interventions. Am J Prev Med 2007, 32: S112-S118. 10.1016/j.amepre.2007.01.022PubMedCentralCrossRefPubMed Collins LM, Murphy SA, Stretcher V: The multiphase optimization strategy (MOST) and the sequential multiple assignment randomized trial (SMART): new methods for more potent e-health interventions. Am J Prev Med 2007, 32: S112-S118. 10.1016/j.amepre.2007.01.022PubMedCentralCrossRefPubMed
19.
Zurück zum Zitat Rivera DE, Pew MD, Collins LM: Using engineering control principles to inform the design of adaptive interventions: a conceptual introduction. Drug Alcohol Depend 2007, 88: S31-S40.PubMedCentralCrossRefPubMed Rivera DE, Pew MD, Collins LM: Using engineering control principles to inform the design of adaptive interventions: a conceptual introduction. Drug Alcohol Depend 2007, 88: S31-S40.PubMedCentralCrossRefPubMed
20.
Zurück zum Zitat Gierisch JM, DeFrank JT, Bowling JM, Rimer BK, Matuszewski JM: Finding the minimal intervention needed for sustained mammography adherence. Am J Prev Med 2010, 39: 334–344. 10.1016/j.amepre.2010.05.020PubMedCentralCrossRefPubMed Gierisch JM, DeFrank JT, Bowling JM, Rimer BK, Matuszewski JM: Finding the minimal intervention needed for sustained mammography adherence. Am J Prev Med 2010, 39: 334–344. 10.1016/j.amepre.2010.05.020PubMedCentralCrossRefPubMed
21.
Zurück zum Zitat Fiore LD, Brophy M, Ferguson RE, D’Avolio L, Hermos JA: A point-of-care clinical trial comparing insulin administered using a sliding scale versus a weight-based regimen. Clin Trials 2011, 8: 183–195. 10.1177/1740774511398368PubMedCentralCrossRefPubMed Fiore LD, Brophy M, Ferguson RE, D’Avolio L, Hermos JA: A point-of-care clinical trial comparing insulin administered using a sliding scale versus a weight-based regimen. Clin Trials 2011, 8: 183–195. 10.1177/1740774511398368PubMedCentralCrossRefPubMed
22.
Zurück zum Zitat Linden A, Adams JL: Applying a propensity score-based weighting model to interrupted time series data: Improving causal inference in programme evaluation. J Eval Clin Pract 2011, 17: 1231–1238. 10.1111/j.1365-2753.2010.01504.xCrossRefPubMed Linden A, Adams JL: Applying a propensity score-based weighting model to interrupted time series data: Improving causal inference in programme evaluation. J Eval Clin Pract 2011, 17: 1231–1238. 10.1111/j.1365-2753.2010.01504.xCrossRefPubMed
23.
24.
Zurück zum Zitat Shadish WR, Galindo R, Wong VC, Steiner PM, Cook TD: A randomized experiment comparing random and cut-off based assignment. Psychol Methods 2011, 16: 179–191.CrossRefPubMed Shadish WR, Galindo R, Wong VC, Steiner PM, Cook TD: A randomized experiment comparing random and cut-off based assignment. Psychol Methods 2011, 16: 179–191.CrossRefPubMed
25.
Zurück zum Zitat Glasgow RE, Magid DJ, Beck A, Ritzwoller D, Estabrooks PA: Practical clinical trials for translating research to practice: design and measurement recommendations. Medical Care 2005, 43: 551–557. 10.1097/01.mlr.0000163645.41407.09CrossRefPubMed Glasgow RE, Magid DJ, Beck A, Ritzwoller D, Estabrooks PA: Practical clinical trials for translating research to practice: design and measurement recommendations. Medical Care 2005, 43: 551–557. 10.1097/01.mlr.0000163645.41407.09CrossRefPubMed
26.
Zurück zum Zitat Solberg LI, Glasgow RE, Unutzer J, Jaeckels N, Oftedahl G: Partnership research: a practical trial design for evaluation of a natural experiment to improve depression care. Med Care 2010, 48: 576–582. 10.1097/MLR.0b013e3181dbea62PubMedCentralCrossRefPubMed Solberg LI, Glasgow RE, Unutzer J, Jaeckels N, Oftedahl G: Partnership research: a practical trial design for evaluation of a natural experiment to improve depression care. Med Care 2010, 48: 576–582. 10.1097/MLR.0b013e3181dbea62PubMedCentralCrossRefPubMed
31. Singh B: Innovation and challenges in funding rapid research responses to emerging infectious diseases: lessons learned from the outbreak of severe acute respiratory syndrome. Can J Infect Dis Med Microbiol 2004, 15: 167–170.
34. Gupta KC: What does the marriage of Open Access and online publication bring? AIDS Res Ther 2004, 14: 1.
36. Etheredge LM: A rapid-learning health system. Health Aff 2007, 26: W107–W118. doi:10.1377/hlthaff.26.2.w107
37. Moores K, Gilchrist B, Carnahan R, Abrams T: A systematic review of validated methods for identifying pancreatitis using administrative data. Pharmacoepidemiol Drug Saf 2012, 21(Suppl 1): 194–202.
42. Hippisley-Cox J, Coupland C: Unintended effects of statins in men and women in England and Wales: population-based cohort study using the QResearch database. BMJ 2010, 340: c2197. doi:10.1136/bmj.c2197
43. Tannen RL, Weiner MG, Xie D: Use of primary care electronic medical record database in drug efficacy research on cardiovascular outcomes: comparison of database and randomized controlled trial findings. BMJ 2009, 338: b81. doi:10.1136/bmj.b81
47. Hamilton CM, Strader LC, Pratt JG, Maiese D, Hendershot T: The PhenX Toolkit: get the most from your measures. Am J Epidemiol 2011, 174: 253–260. doi:10.1093/aje/kwr193
48. Noonan VK, Cook KF, Bamer AM, Choi SW, Kim J, Amtmann D: Measuring fatigue in persons with multiple sclerosis: creating a crosswalk between the Modified Fatigue Impact Scale and the PROMIS Fatigue Short Form. Qual Life Res 2012, 21: 1123–1133. doi:10.1007/s11136-011-0040-3
49. National Research Council: Toward Precision Medicine: Building a New Knowledge Network for Biomedical Research and a New Taxonomy of Disease. Washington DC: National Academies Press; 2011.
Metadata
Title
Rapid, responsive, relevant (R3) research: a call for a rapid learning health research enterprise
Authors
William T Riley
Russell E Glasgow
Lynn Etheredge
Amy P Abernethy
Publication date
01.12.2013
Publisher
Springer Berlin Heidelberg
Published in
Clinical and Translational Medicine / Issue 1/2013
Electronic ISSN: 2001-1326
DOI
https://doi.org/10.1186/2001-1326-2-10