Original Article

An implementation science primer for psycho-oncology: translating robust evidence into practice

Rankin, Nicole M.a,b,∗; Butow, Phyllis N.c; Hack, Thomas F.d; Shaw, Joanne M.c; Shepherd, Heather L.c; Ugalde, Annae; Sales, Anne E.f

Journal of Psychosocial Oncology Research and Practice: December 2019 - Volume 1 - Issue 3 - p e14
doi: 10.1097/OR9.0000000000000014
Open Access


1 Introduction

The discipline of psycho-oncology covers a diverse range of research and practice efforts that seek to ameliorate the psychological, social, and emotional sequelae following a person's diagnosis of cancer.[1] A significant research focus of the discipline is the development of evidence-based interventions, health programs, and innovations (hereafter “interventions”) to address the needs of patients and caregivers across a wide spectrum of cancer diagnoses, across anti-cancer treatments, and through to palliative care and survivorship. However, many evidence-based psychosocial interventions are not integrated into routine clinical care.[2,3]

There are significant opportunities for psycho-oncology researchers and clinicians to engage in a broader agenda to conduct implementation research. This will enable knowledge outputs from the discipline to benefit the broader populations of people diagnosed with cancer and their caregivers. The discipline has been striving for greater integration between research and practice over the last 2 decades[4,5] and it is acknowledged that implementation is the next global challenge for psycho-oncology.[6–9] There is little guidance available for psycho-oncology researchers and clinicians about implementation science and how to optimally accelerate the translation of evidence into routine practice.[7] Key texts and introductory guides to implementation science are widely available, but not tailored for a psycho-oncology audience.[10–12]

Thus, the aim of this article is to provide a primer in implementation science for psycho-oncology professionals and describe current approaches to timely implementation. We introduce core concepts and principles of implementation science and provide examples from the psycho-oncology and broader oncology literature to help operationalize concepts. We conclude by identifying ways to accelerate translational research in psycho-oncology and highlight opportunities for interdisciplinary collaboration.

1.1 What is implementation science?

Implementation science is an emergent discipline within the broader spectrum of translational research[10] (Fig. 1). Implementation science is defined as “the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice, and, hence, to improve the quality and effectiveness of health services. It includes the study of influences on healthcare professional and organisational behavior.”[13] Definitions and glossaries of terms help nonexperts to navigate the distinctions between dissemination and implementation research and similar terms that are used across different settings.[14–16] Research questions are typically focused on the “how,” “why,” and “who” of implementation, rather than the creation and testing of new interventions.[17] Examples of questions include: “how can we reduce the gap in the quality of care delivered to patients?”; “how can we implement an effective intervention in real-world settings?”; and “how can we implement strategies to help facilitate successful implementation?”[10]

Figure 1: Stages of research and phases of dissemination and implementation (Landsverk et al).

A goal of implementation science is to create generalizable knowledge that can be replicated across different settings and contexts,[16,17] where context is broadly defined as “the environment or setting in which the proposed change is to be implemented.”[18] This is in contrast with the basic sciences and clinical research, wherein an intervention is tested under tightly controlled conditions and information is gathered about internal validity.[17] Implementation science is concerned with the application of interventions in real-world settings and with external validity, that is, to what populations, settings, and treatment and measurement variables can the effects of an intervention be generalized?[11,19] Although the discipline is inclusive of implementation practice (studies about the “doing” or “how to” of implementation), the focus here is on implementation research, which is the scientific study of understanding how and why implementation succeeds or fails in real-world settings.[14,17] Practice examples in psycho-oncology include a distress-screening implementation study by Lazenby et al[20] and the psychosocial evaluation matrix study by Forsythe et al.[21] An example of implementation research is the examination by Hack et al[22,23] of the mechanisms that facilitate the uptake of audio-recordings of oncology consultations, a practice with a well-established evidence base but poor uptake.

1.2 Where does implementation science fit within translational research?

The term “translational research” has multiple definitions and meanings to various audiences in health and medical sciences.[24] We use the National Institutes of Health's definition, which outlines 2 primary areas of translational research. The first is “bench to bedside” translation, in which basic science and preclinical discoveries (T0) are carried into human subject research (T1), and subsequently translated through clinical trial results and other research findings (T2) to the bedside.[25] The second area, known as “bedside to community” translation, is concerned with how knowledge is translated into practice in clinical (T3) settings. More recent translational models incorporate a “T4” component and, although precise definitions of T4 are lacking, it encapsulates the effectiveness and outcomes of interventions at a community or population level.[24] This broad scope includes the vast, related literature on knowledge translation.[26] Implementation science is considered to fit within the space of T3 and T4 (Fig. 2).[27] Quality improvement and the improvement sciences also fall within the T3 component but are focused on improving health system performance and managing change.[17]

Figure 2: The Clinical and Translational Science Award (CTSA) program, US National Institutes of Health 2013, based on Blumberg (2012).

1.3 A translational perspective for behavioral interventions

The linear “pipeline” model of translational research[28] may have less resonance with the psycho-oncology community, where most interventions do not originate in the laboratory and are not tested on nonhuman subjects. Alternative models from the behavioral sciences have emerged,[29] including a relational model described by Gitlin and Czaja.[30] In this model, translational research consists of 3 components: behavioral intervention research, implementation science, and practice change, with each component interacting and influencing the other.[30] This model can be readily adapted to the psycho-oncology setting, where researchers and clinicians develop evidence-based behavioral interventions and evaluate their impact on practice change. Implementation science is viewed as the bridge between the stages of intervention development research (the designing and testing of an intervention) and practice (or behavior) change.

1.4 Core components of implementation science

1.4.1 Is there a care or quality gap?

Research questions in implementation science begin by acknowledging a gap between scientific discoveries or evidence and their translation into clinical practice, known as the “evidence-practice gap.”[11,31] This lack of translation of interventions into clinical practice has been widely documented, and scholars have noted that “it takes 17 years to turn 14 per cent of original research to the benefit of patient care.”[28,32] Numerous factors contribute to this gap. Implementation researchers frequently begin with an analysis that identifies the barriers and facilitators (also termed “enablers”) across multiple layers, including the individual, team, organizational, and social levels.[33] Barriers and facilitators are also described as the “determinants of change,”[34] which can assist in understanding how individuals’ collective beliefs, attitudes, knowledge, and motivation can impede or facilitate behavior change. Implementation scientists are particularly interested in the determinants encountered at the health service or health system levels.[35]

In the psycho-oncology literature, barriers and facilitators have been documented for the challenges of integrating psychosocial services into cancer services, as well as for specific gaps relevant to implementing specialist psycho-oncology interventions. A study by Schofield et al[36] identified barriers and enablers to delivering psychosocial and supportive care within the broader oncology clinic setting using the PRECEDE-PROCEED framework.[37] They describe predisposing, enabling, and reinforcing factors that contribute to the evidence-quality gap. Predisposing factors include clinicians’ lack of knowledge about the benefits of psychosocial care, their beliefs and attitudes, and a lack of self-efficacy in identifying psychological distress. Enabling factors include time (such as clinicians briefly engaging and responding to emotional and informational cues), assessment skills, and systems. Reinforcing factors include feedback on performance, rewards for performance, and negative consequences. Hack et al identified critical and common barriers to implementation of psycho-oncology and supportive care interventions; the critical barriers include evidence (eg, inconsistent empirical evidence), a lack of champions to implement activities, and a lack of dedicated resources and administrative staff.[33] An evidence-practice gap that has gained significant attention in psycho-oncology is distress screening. Far less has been documented about the barriers that specialist psycho-oncology clinicians experience in delivering interventions in practice. The challenges that practitioners experience in locating, assessing, interpreting, and applying evidence have been noted,[33] as have issues in training.[38,39]

1.4.2 A solid evidence-base for interventions

Implementation science requires a strong evidence base that demonstrates the efficacy of an intervention, which warrants further scientific investigation and application in real-world settings.[31] Typically, interventions are underpinned by randomized efficacy trials and effectiveness studies, are cited in systematic reviews and/or meta-analyses, or are included in evidence summaries, such as clinical practice guidelines.[31] The Cochrane Collaboration's Effective Practice and Organisation of Care (EPOC) criteria are helpful to assess whether an intervention is sufficiently ready for broader dissemination and implementation.[40]

We propose that there are some significant challenges for the psycho-oncology community to address with regard to the evidence base. First, much effort in psycho-oncology has focused on discovery and on the initial development and testing of psychosocial interventions to show the benefits of providing these interventions in the routine care of patients and carers. Much of the literature is descriptive, and, although this is useful as a precursor to intervention development, many studies have not been prepared or published with translation in mind. In a 2012 review of the supportive care needs literature, Carey et al found that <6% of studies aimed to close the gap between evidence and clinical practice[41]; by 2019, little had changed, with Sanson-Fisher et al identifying only 5 intervention trials in the discipline's leading journals.[42] A second challenge concerns the quality of interventions that have been tested.[2] Many studies suffer from a lack of scientific rigor, weak or inappropriate study design, and an inappropriate selection of target populations, resulting in small effect sizes.[16,43,44] Studies need to identify the active components of interventions to enable replication, including aspects of delivery, frequency, and dose, and to define where active elements can be adapted or modified in real-world settings.[15] A third challenge is the lack of pragmatic interventions that can be implemented across diverse settings.[42] Interventions that are highly resource-intensive, very expensive to deliver, or that have poor levels of acceptability or feasibility with the intended population are unlikely to be suited to wide-scale implementation.

1.4.3 Conceptual model, framework or theoretical justification

Implementation science considers it essential to use one or more conceptual models, frameworks, or theories to support implementing interventions into practice. These terms are often used “interchangeably and imprecisely,” which can lead to confusion.[16] There are numerous frameworks, accompanied by limited advice on how best to make a selection for an implementation study.[45,46] Nilsen (2015) presents a taxonomy of theories, models, and frameworks, noting that there are overlaps across these. Three are particularly relevant for understanding the core of implementation science: process models; determinant frameworks; and evaluation frameworks.[47] We briefly summarize these and provide some examples from the psycho-oncology and supportive care literature.

Process models help to guide and organize the processes of translating research into practice. These models address stages of deliberate preimplementation planning, testing strategies in the implementation phase, and evaluating outcomes.[47] Examples include the Knowledge-to-Action (KTA) framework developed by Graham et al[48] and widely disseminated by the Canadian Institutes of Health Research, and a stepped model by Grol et al[49] from improvement sciences. The KTA framework has been used by Nadler et al to explore oncology care providers’ perspectives on promoting exercise to people diagnosed with cancer.[50]

Determinant frameworks describe factors that could influence implementation outcomes; they typically comprise barriers and facilitators that help guide the selection of implementation strategies, and most come from empirical study. Examples of well-utilized determinant frameworks include the Theoretical Domains Framework, the Consolidated Framework for Implementation Research (CFIR),[51] and the PARIHS framework.[52,53] The most recent consolidated framework is the Tailored Implementation for Chronic Disease Checklist, which draws on both the Theoretical Domains Framework and the CFIR.[54] Examples of studies that use the PARIHS framework are by Hack et al[22] (as described above) and by Tian et al,[55] who examined the management of cancer-related fatigue by nurses. The CFIR has been used extensively across the cancer control spectrum, and studies can be located via the CFIR website.[56]

Evaluation frameworks provide a “structure for evaluating implementation endeavours.”[47] Examples include RE-AIM,[57,58] the Predisposing, Reinforcing, and Enabling Constructs in Educational Diagnosis and Evaluation-Policy, Regulatory, and Organizational Constructs in Educational and Environmental Development (PRECEDE-PROCEED) framework,[37] as used in the Schofield et al study described above, and Proctor et al's Implementation Outcomes framework.[59] A current example that illustrates use of the RE-AIM framework to evaluate an implementation study is the “Healthy Living After Cancer” program, which will evaluate the feasibility and costs of wide-scale implementation of a cancer survivorship program across Australia.[60]

The Nilsen taxonomy includes 2 remaining categories: classic theories (which originate from related disciplines such as psychology, sociology, and organizational theory) and implementation theories.[47] Classic theories are frequently used in intervention development to explain behavior change (such as Bandura's theory of self-efficacy and Social Cognitive theories) and can also help interpret or explain aspects of implementation. Implementation theories come from within the discipline and similarly assist in understanding aspects of implementation. Both contribute to the development of complex interventions and to testing these in health services.[61]

Implementation scientists base their selection of a framework or model on the clinical problem (or research question) and the stage of implementation. It is often useful to use a process framework for planning implementation; a determinants framework to design or select implementation strategies; and an evaluation framework to support evaluation of the effectiveness of implementation. Choices may also be guided by whether there are relevant measures that can help to operationalize the constructs of inquiry,[45] and newly developed tools are emerging to support selection.[62] Some research studies will use a combination of different models and frameworks to assist in design and the interpretation of findings.

1.4.4 Evaluation study design

To evaluate the effectiveness of implementation efforts, implementation scientists draw on a wide range of study designs from qualitative and quantitative traditions; different designs are used across the exploratory, preimplementation planning, implementation, evaluation, and sustainment phases.[63] The designs selected are informed by the research question, and the focus is usually at the health service (the team or organization) or health system level rather than the individual patient.[42] This point of difference can be challenging for researchers and clinicians who are trained in designing efficacy studies, where research focuses on the patient population, intervention, comparator, and outcomes. Common study designs used to assess the effectiveness of implementation efforts include cluster randomized controlled trials, stepped wedge designs, interrupted time series, and controlled pre-post designs.[10,64–66] These types of study designs are particularly relevant when conducting studies of complex interventions in health services or systems. Qualitative and mixed methods are used extensively, and the discipline also draws on designs from the literature on process evaluations, observational designs, and health economics.[49]
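To make the stepped wedge design concrete, the sketch below builds a hypothetical rollout schedule: clusters (eg, cancer services) all begin in the control condition and cross over to the intervention one per period, in randomized order, until every cluster is exposed. The function name, parameters, and cluster counts are our own illustrative assumptions, not from any cited study.

```python
import random

def stepped_wedge_schedule(n_clusters=6, n_steps=6, seed=42):
    """Build a hypothetical stepped wedge rollout.

    Each period, one more cluster (eg, a cancer service) crosses from
    control (0) to intervention (1); all clusters start in control and
    all are exposed by the final period. The crossover order is
    randomized, mirroring the randomized sequence in a stepped wedge trial.
    """
    rng = random.Random(seed)
    clusters = list(range(n_clusters))
    rng.shuffle(clusters)  # randomize which cluster crosses over at each step
    # schedule[cluster][period] is 1 once that cluster has crossed over
    schedule = {c: [0] * (n_steps + 1) for c in clusters}
    for step, cluster in enumerate(clusters, start=1):
        for period in range(step, n_steps + 1):
            schedule[cluster][period] = 1
    return schedule

if __name__ == "__main__":
    schedule = stepped_wedge_schedule()
    for cluster in sorted(schedule):
        print(f"service {cluster}: {schedule[cluster]}")
```

Because every cluster contributes both control and intervention periods, each acts as its own control, which is one reason the design appeals for health service level studies where withholding an intervention entirely may be unacceptable.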

1.4.5 Implementation strategies

Concomitant with the development of frameworks, implementation strategies are used to guide planned behavior change at the individual level as well as system-level changes (eg, at the department or unit, organizational, and societal levels). These are enumerated in several different papers and taxonomies, many of which overlap. A parsimonious introduction to implementation strategies is provided by Grimshaw et al, who briefly describe 12 commonly used strategies, along with what is known about their effectiveness in supporting implementation of evidence-based practices[67]; these include educational strategies (printed materials, meetings, outreach), audit and feedback, local opinion leaders, and computerized reminders. A more extensive list, organized alphabetically, is provided by Powell et al.[68] Most of these strategies are included in the Cochrane Effective Practice and Organisation of Care (EPOC) group taxonomy of implementation strategies, with strategies grouped by domains.[69] These strategies have also been topics of systematic reviews and meta-analyses, many published by the Cochrane Collaboration. Implementation strategies are a primary focus for the discipline, and Standards for Reporting Implementation Studies (StaRI) have recently been published.[70,71]

There is relatively little research published about implementation strategies within the context of psycho-oncology. A narrative review of 11 published systematic reviews (covering 495 original articles of mixed study designs) was published by Rabin and Glasgow in the broad areas of psychology and cancer.[12] They note the poor quality of studies, a limited use of theory, and that few report on implementation outcomes. Identified implementation strategies focused on the distribution of educational materials to patients/consumers and health care providers using outreach, audit and feedback, opinion leaders, and reminder systems. More recently, a systematic review by McCarter et al sought to determine the effectiveness of strategies to improve clinician provision of psychosocial distress screening and referral of patients with cancer.[72] Five studies were identified and most had poor quality ratings; only 1 study reported a significant improvement in psychosocial referrals using specified implementation strategies. Clearly, there is significant opportunity to investigate implementation strategies and to test these using robust methodologies,[73] rather than reporting descriptive accounts of their use in specific settings.[74]

1.4.6 Implementation outcomes

Implementation science focuses on those outcomes that are measured to demonstrate successful implementation. Implementation outcomes are defined by Proctor et al as “the effects of deliberate and purposive actions to implement new treatments, practices, and services,”[59] whereas the definition in the StaRI statement is a “process or quality measure to assess the impact of the implementation strategy.”[70] Thus, outcomes are typically focused on the processes of implementing an intervention, which is in contrast to effectiveness studies that focus on primary outcomes of the intervention (eg, clinical, patient, or services outcomes). Implementation research includes proximal outcomes (eg, measuring organizational readiness for change, adoption of an evidence-based intervention) and distal outcomes (eg, sustainability of an intervention over time).[11] Core implementation outcomes that were identified by Proctor et al through a narrative literature review include: acceptability, appropriateness, adoption, feasibility, fidelity, implementation costs, penetration (elsewhere labeled as integration), and sustainability.[59] A challenge for the discipline is the development and validation of instruments to measure these outcomes[75] and to help researchers and clinicians to select appropriate outcomes and measures to evaluate implementation “success.” It is important to measure implementation outcomes, as another guiding principle of implementation science is to distinguish whether success (or failure) is because of the intervention or the implementation strategies selected.[71]

In the next section, we describe key components that should be considered before commencing any implementation study.

1.4.7 Team expertise

The implementation science boundaries of the “team” extend beyond the traditional ones common to many scientific disciplines. Implementation research draws on teams with diverse expertise that can contribute to different aspects of study design and the selection of implementation strategies.[31] This means working collaboratively with health services researchers, health economists, and other specialist disciplines such as sociology and organizational, behavioral, and managerial specialists, as well as health service staff and policy makers.[76] The related literature on “team science” emphasizes the importance of developing a common language to facilitate research endeavors across disciplines.[77,78] Psycho-oncology clinicians and researchers will be familiar with working in a multidisciplinary setting in cancer care services.[74] Collaborations across broad disciplinary teams can foster complex trial interventions in psycho-oncology. A recent example by Butow et al[73] engages researchers, clinicians, health economists, and health service staff across jurisdictions to test implementation strategies for a clinical pathway for anxiety and depression in 12 cancer services. Other researchers have noted the need to engage specialist psycho-oncology staff to support implementation.[79]

1.4.8 Engaging stakeholders throughout implementation research

Implementation scientists view stakeholder engagement as a necessity throughout the research process.[11,80] Researchers are increasingly expected to engage with stakeholders as part of good research practice, and methods for doing so are rapidly evolving.[81–84] Community-based participatory research principles include collaborative engagement during the stages of intervention and study design.[80] Implementation research in low- and middle-income countries places a particularly strong emphasis on the essential need to build trust and capacity with stakeholders across intervention design and selection of implementation strategies.[85]

The consumer advocacy movement and engagement with survivors are strengths of cancer control in many communities. Researchers are encouraged not to assume that full participation and engagement have already been achieved in the design of psycho-oncology research studies. Two recent systematic reviews highlight the need for greater engagement. A systematic review of cancer caregiver interventions by Ugalde et al found that 65% of 26 studies included no evidence of caregiver involvement in intervention development.[86] A review of survivorship care plans by Keesing et al showed a lack of stakeholder consensus in developing the core features of plans.[87] Without stakeholder engagement, interventions appear more likely to fail because of a lack of acceptability or uptake.[88]

1.4.9 Assessing readiness for implementation and engaging in change processes

Implementation science recognizes the interplay of individual behavior change (typically of providers, as well as patients and other stakeholders) and organizational change processes. A key concept related to this aspect of implementation research is organizational readiness for change, defined as a multilevel and multifaceted construct that refers to organizational members’ change commitment and efficacy to implement change.[89] A number of instruments have been developed to assess organizational readiness for change, which focus on individuals’ perceptions of their organizational environment.[90] These tools may be useful for inclusion in the planning stages of implementation research.

1.5 Opportunities to accelerate the implementation of evidence-based psychosocial interventions

In light of the lengthy time lags between discovery and routine implementation of interventions, there is considerable interest from health funders and policy makers in how best to accelerate the translation of research into practice.[91] This is driven by concerns about the significant amount of money invested in research and a lack of return on investment in seeing improvements in patient care. In this section, we address 3 broad areas for consideration to accelerate the implementation of psychosocial and supportive care interventions into cancer care.

1.5.1 Applying what we already know

The lack of translation of interventions with demonstrated efficacy can be observed across the cancer continuum. In the case of cancer prevention, for example, the World Health Organisation notes that about one-third of all cancers could be prevented through interventions focused on smoking cessation, increased physical activity, and addressing obesity and overweight.[92] Colditz et al[93] provide insightful examples of how obstacles in applying what is already known in cancer prevention might be overcome. Similarly, psychosocial interventions for cancer patients and caregivers with demonstrated efficacy in a research setting need to be tested in real-world settings[2]; this requires a paradigm shift within psycho-oncology to generate new research questions about how interventions can be scaled up and spread to new settings or patient populations. Thus, the nature of the research question changes from “is this intervention effective?” to “how can we successfully implement this intervention with this target population?”, although the latter question is only relevant once interventions have been shown to be effective. Another aspect of this paradigm shift is to place more effort in the robust evaluation of programs in different settings. For example, an intervention developed for early-stage or curative cancers will need to be carefully evaluated for its relevance and ability to be translated to the advanced cancer population, where existential issues are more prominent.[94]

A common barrier to applying existing interventions is that both researchers and clinicians experience challenges in finding, assessing, interpreting, and applying best evidence.[33] We consider that this barrier extends to challenges in accessing the details of an intervention or therapeutic program (eg, access to training manuals, data collection tools, and other intellectual outputs) to enable replication or adaptation for new settings. One solution is the Research-Tested Intervention Programs searchable database of cancer control interventions and programs that are ready for translation.[95] This initiative is funded by the US National Cancer Institute and includes 18 supportive care and survivorship programs. The psycho-oncology community could readily access and make significant contributions to this initiative as a centralized means of facilitating the spread and scaling-up of interventions.[96]

1.5.2 Innovative research designs

There is significant enthusiasm for using innovative research designs in implementation science. Hybrid designs are of particular interest and are defined by Curran et al as taking “a dual focus a priori in assessing clinical effectiveness and implementation.”[97] These pragmatic research designs seek to simultaneously gather data about health outcomes by examining both the intervention and the implementation strategies used to promote uptake.[98] Hybrid designs are employed following the completion of efficacy studies and do not “short-cut” the developmental steps of intervention design and pilot-testing. Three types of hybrid designs are outlined in Table 1,[99–104] which includes examples of study protocols and published studies. By combining research aims that address both the intervention and implementation, researchers and clinicians using hybrid designs can report results about how to move the intervention into practice or policy.[105] Psycho-oncology researchers and clinicians could readily use hybrid designs to conduct large-scale multicenter trials and report on intervention effectiveness and the outcomes of implementation strategies.[7]

Table 1
Table 1:
Classification of hybrid designs and relevant examples from the literature.

A second innovation is the “scaling out” of existing interventions. Aarons et al propose this concept as the “deliberate use of strategies to implement, test, improve, and sustain interventions as they are delivered in novel circumstances distinct from, but closely related to, previous implementations.”[106] Scaling out can involve adapting interventions to new populations, new delivery systems, or both; cancer studies that are beginning to appear in the literature include the “scale out” of a breast cancer screening program to a broader population group.[107] Although further testing and refinement of this innovation are needed, we suggest that scaling out could be highly relevant to psycho-oncology. This type of research design could move existing interventions to new patient groups (eg, different tumor types, culturally diverse communities) or test interventions designed for outpatient oncology settings in primary or community care settings. An obvious starting point is to adapt psychosocial interventions designed for patients with breast cancer to other patient groups with different cancer diagnoses.

1.5.3 Accelerating change through investment in translational research

Many developed countries are increasing their investment in T3 and T4 translational research. The United States has made significant investments in the clinical and translational sciences and in dissemination and implementation research over the past 2 decades,[108,109] as has Canada in knowledge translation.[110] Other countries, including the United Kingdom, the Netherlands, Denmark, and Australia, are dedicating more resources to translational research that directly impacts patient outcomes. However, a lack of transparency from some major funders and philanthropic or private foundations is a reported barrier.[111] It may be too early to demonstrate that investment in translational research accelerates the use of research findings in clinical practice.[112,113] However, as the psycho-oncology community rises to the global challenge of prioritizing implementation research, we must consider what strategies or mechanisms might foster greater investment. These include demonstrating the economic benefits of psychological interventions in reducing health care costs and improving quality of life.[114–117] We also need champions who will advocate for dedicated funding of large-scale implementation trials to test interventions with different populations.

1.5.4 Future opportunities for collaboration

We believe the time is ripe for building capacity and conducting interdisciplinary research across psycho-oncology and implementation science. Areas for collaboration include greater use of theory-driven approaches to evaluate the implementation of psychosocial interventions. Attention should focus on identifying the implementation strategies most likely to lead to successful uptake of psychosocial and supportive care interventions in practice. Future collaborative research should evaluate complex interventions in different health systems, drawing on the study designs and outcomes described above and including collaboration with key stakeholders. We also advocate for significant investment in education and training in implementation science for psycho-oncology researchers and clinicians.

This article has described the fundamental concepts and principles of implementation science for a psycho-oncology audience, to increase the number and quality of implementation studies. As the discipline of psycho-oncology engages in the global challenge of implementing robust evidence-based treatments into routine clinical practice, the use of innovative approaches can help accelerate change and promote better integration of psychosocial interventions across the cancer care continuum.

Conflicts of interest statement

The authors declare that they have no financial conflict of interest with regard to the content of this report.


[1]. Holland JC. History of psycho-oncology: overcoming attitudinal and conceptual barriers. Psychosom Med 2002;64:206–221.
[2]. Andersen BL, Dorfman CS. Evidence-based psychosocial treatment in the community: considerations for dissemination and implementation. Psychooncology 2016;25:482–490.
[3]. Jacobsen PB, Norton WE. The role of implementation science in improving distress assessment and management in oncology: a commentary on “Screening for psychosocial distress among patients with cancer: implications for clinical practice, healthcare policy, and dissemination to enhance cancer survivorship”. Transl Behav Med 2019;9:292–295.
[4]. Dunn J, Bultz BD, Watson M. Emerging international directions for psychosocial care. In: Holland JC, Breitbart WS, Jacobsen PB, Loscalzo MJ, McCorkle R, Butow PN, eds. Psycho-Oncology. 3rd ed. Oxford: Oxford University Press; 2015.
[5]. Corner J. Interface between research and practice in psycho-oncology. Acta Oncol 1999;38:703–707.
[6]. Rodin G. From evidence to implementation: the global challenge for psychosocial oncology. Psychooncology 2018;27:2310–2316.
[7]. Jacobsen PB. New Challenges in Psycho-Oncology Research II: a health care delivery, dissemination, and implementation research model to promote psychosocial care in routine cancer care. Psychooncology 2017;26:419–423.
[8]. Jacobsen PB, Lee M. Integrating psychosocial care into routine cancer care. Cancer Control 2015;22:442–449.
[9]. Holland JC. Introduction: History of Psycho-Oncology. In: Psycho-Oncology. New York: Oxford University Press; 2015.
[10]. Brownson RC, Colditz GA, Proctor EK. Dissemination and Implementation Research in Health. New York: Oxford; 2017.
[11]. Neta G, Brownson RC, Chambers DA. Opportunities for epidemiologists in implementation science: a primer. Am J Epidemiol 2018;187:899–910.
[12]. Rabin B, Glasgow RE. An implementation science perspective on psychological science and cancer: what is known and opportunities for research, policy, and practice. Am Psychol 2015;70:211–220.
[13]. Eccles MP, Mittman BS. Welcome to Implementation Science. Implement Sci 2006;1:1–11.
[14]. Rabin BA, Brownson RC, Haire-Joshu D, Kreuter MW, Weaver NL. A Glossary for dissemination and implementation research in health. J Public Health Manag Pract 2008;14:117–123.
[15]. Rabin BA, Brownson RC. Terminology for dissemination and implementation research. In: Brownson RC, Colditz G, Proctor E, eds. Dissemination and Implementation Research in Health: Translating Science to Practice. 2nd ed. New York: Oxford; 2017.
[16]. Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. An introduction to implementation science for the non-specialist. BMC Psychol 2015;3:32.
[17]. Livet M, Haines ST, Curran GM, et al. Implementation science to advance care delivery: a primer for pharmacists and other health professionals. Pharmacotherapy 2018;38:490–502.
[18]. May CR, Johnson M, Finch T. Implementation, context and complexity. Implement Sci 2016;11:141.
[19]. Green LW, Nasser M. Furthering dissemination and implementation research: the need for more attention to external validity. In: Brownson RC, Colditz GA, Proctor EK, eds. Dissemination and Implementation Research in Health. New York: Oxford; 2017.
[20]. Lazenby M, Ercolano E, Knies A, et al. Psychosocial distress screening: an educational program's impact on participants’ goals for screening implementation in routine cancer care. Clin J Oncol Nurs 2018;22:E85–e91.
[21]. Forsythe LP, Rowland JH, Padgett L, et al. The cancer psychosocial care matrix: a community-derived evaluative tool for designing quality psychosocial cancer care delivery. Psychooncology 2013;22:1953–1962.
[22]. Hack TF, Ruether JD, Weir LM, Grenier D, Degner LF. Promoting consultation recording practice in oncology: identification of critical implementation factors and determination of patient benefit. Psychooncology 2013;22:1273–1282.
[23]. Rieger KL, Hack TF, Beaver K, Schofield P. Should consultation recording use be a practice standard? A systematic review of the effectiveness and implementation of consultation recordings. Psychooncology 2018;27:1121–1128.
[24]. Khoury MJ, Clauser SB, Freedman AN, et al. Population sciences, translational research, and the opportunities and challenges for genomics to reduce the burden of cancer in the 21st century. Cancer Epidemiol Biomarkers Prev 2011;20:2105–2114.
[25]. Fort DG, Herr TM, Shaw PL, Gutzman KE, Starren JB. Mapping the evolving definitions of translational research. J Clin Transl Sci 2017;1:60–66.
[26]. Straus SE, Tetroe J, Graham I. Defining knowledge translation. CMAJ 2009;181:165–168.
[27]. Blumberg RS, Dittel B, Hafler D, von Herrath M, Nestle FO. Unraveling the autoimmune translational research process layer by layer. Nat Med 2012;18:35–41.
[28]. Green LW. Making research relevant: if it is an evidence-based practice, where's the practice-based evidence? Fam Pract 2008;25 (supp 1):i20–i24.
[29]. Hommel KA, Modi AC, Piazza-Waggoner C, Myers JD. Topical review: translating translational research in behavioral science. J Pediatr Psychol 2015;40:1034–1040.
[30]. Gitlin LN, Czaja SJ. Behavioral intervention research: designing, evaluating, and implementing. New York: Springer; 2016.
[31]. Proctor EK, Powell BJ, Baumann AA, Hamilton AM, Santens RL. Writing implementation research grant proposals: ten key ingredients. Implement Sci 2012;7:96–196.
[32]. Weingarten S, Garb CT, Blumenthal D, Boren SA, Brown GD. Improving preventive care by prompting physicians. Arch Intern Med 2000;160:301–308.
[33]. Hack TF, Carlson L, Butler L, et al. Facilitating the implementation of empirically valid interventions in psychosocial oncology and supportive care. Support Care Cancer 2011;19:1097–1105.
[34]. Wensing M, Bosch M, Grol R. Determinants of change. In: Grol R, Wensing M, Eccles MP, Davis D, eds. Improving patient care: The implementation of change in health care. West Sussex, United Kingdom: Wiley Blackwell; 2013.
[35]. Geerligs L, Rankin NM, Shepherd HL, Butow P. Hospital-based interventions: a systematic review of staff-reported barriers and facilitators to implementation processes. Implement Sci 2018;13:36.
[36]. Schofield P, Carey M, Bonevski B, Sanson-Fisher R. Barriers to the provision of evidence-based psychosocial care in oncology. Psychooncology 2006;15:863–872.
[37]. Green LW, Kreuter MW. Health Program Planning: An Educational and Ecological Approach. 4th ed.New York: McGraw-Hill; 2005.
[38]. Andersen BL, Ryba MM, Brothers BM. Implementation of an evidence-based biobehavioral treatment for cancer patients. Transl Behav Med 2017;7:648–656.
[39]. Brothers BM, Carpenter KM, Shelby RA, et al. Dissemination of an evidence-based treatment for cancer patients: training is the necessary first step. Transl Behav Med 2014;5:103–112.
[40]. Bero L, Deane K, Eccles M, et al. About The Cochrane Collaboration: Cochrane Effective Practice and Organisation of Care Review Group (Cochrane Group Module). Oxford; 2009.
[41]. Carey M, Lambert S, Smits R, Paul C, Sanson-Fisher R, Clinton-McHarg T. The unfulfilled promise: a systematic review of interventions to reduce the unmet supportive care needs of cancer patients. Support Care Cancer 2012;20:207–219.
[42]. Sanson-Fisher R, Hobden B, Watson R, et al. The new challenge for improving psychosocial cancer care: shifting to a system-based approach. Support Care Cancer 2019;27:763–769.
[43]. Faller H, Schuler M, Richard M, Heckl U, Weis J, Kuffner R. Effects of psycho-oncologic interventions on emotional distress and quality of life in adult patients with cancer: systematic review and meta-analysis. J Clin Oncol 2013;31:782–793.
[44]. Shaw JM, Sekelja N, Frasca D, Dhillon HM, Price MA. Being mindful of mindfulness interventions in cancer: a systematic review of intervention reporting and study methodology. Psychooncology 2018;27:1162–1171.
[45]. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med 2012;43:337–350.
[46]. Birken SA, Powell BJ, Shea CM, et al. Criteria for selecting implementation science theories and frameworks: results from an international survey. Implement Sci 2017;12:124.
[47]. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci 2015;10:53.
[48]. Graham ID, Logan J, Harrison MB, et al. Lost in knowledge translation: time for a map? J Contin Educ Health Prof 2006;26:13–24.
[49]. Grol R, Wensing M, Eccles M, Davis D. Improving Patient Care: The Implementation of Change in Health Care. West Sussex, United Kingdom: Wiley Blackwell; 2013.
[50]. Nadler M, Bainbridge D, Tomasone J, Cheifetz O, Juergens RA, Sussman J. Oncology care provider perspectives on exercise promotion in people with cancer: an examination of knowledge, practices, barriers, and facilitators. Supportive care in cancer: official journal of the Multinational Association of Supportive Care in Cancer 2017;25:2297–2304.
[51]. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 2009;4:50.
[52]. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Qual Health Care 1998;7:149–159.
[53]. Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci 2016;11:33.
[54]. Flottorp SA, Oxman AD, Krause J, et al. A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implement Sci 2013;8:35.
[55]. Tian L, Yang Y, Sui W, et al. Implementation of evidence into practice for cancer-related fatigue management of hospitalized adult patients using the PARIHS framework. PloS One 2017;12:e0187257.
[56]. CFIR Research Team. Consolidated Framework for Implementation Research. 2019. Accessed June 15, 2019.
[57]. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health 1999;89:1322–1327.
[58]. Gaglio B, Shoup JA, Glasgow RE. The RE-AIM framework: a systematic review of use over time. Am J Public Health 2013;103:e38–e46.
[59]. Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health 2011;38:65–76.
[60]. Eakin EG, Hayes SC, Haas MR, et al. Healthy living after cancer: a dissemination and implementation study evaluating a telephone-delivered healthy lifestyle program for cancer survivors. BMC Cancer 2015;15:992.
[61]. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ 2008;337:a1655.
[62]. Birken SA, Rohweder CL, Powell BJ, et al. T-CaST: an implementation theory comparison and selection tool. Implement Sci 2018;13:143.
[63]. Landsverk J, Brown CH, Smith JD, et al. Design and analysis in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, eds. Dissemination and Implementation Research in Health. New York: Oxford; 2017.
[64]. Sanson-Fisher RW, D’Este CA, Carey ML, Noble N, Paul CL. Evaluation of systems-oriented public health interventions: alternative research designs. Annu Rev Public Health 2014;35:9–27.
[65]. Green CA, Duan N, Gibbons RD, Hoagwood KE, Palinkas LA, Wisdom JP. Approaches to mixed methods dissemination and implementation research: methods, strengths, caveats, and opportunities. Adm Policy Ment Health 2015;42:508–523.
[66]. Brown CH, Curran G, Palinkas LA, et al. An overview of research and evaluation designs for dissemination and implementation. Ann Rev Public Health 2017;38:1–22.
[67]. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci 2012;7:50–150.
[68]. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci 2015;10:21.
[69]. Effective Practice and Organisation of Care. Effective Practice and Organisation of Care (EPOC) Taxonomy. 2015.
[70]. Pinnock H, Barwick M, Carpenter CR, et al. Standards for Reporting Implementation Studies (StaRI) Statement. BMJ 2017;356:i6795.
[71]. Pinnock H, Barwick M, Carpenter CR, et al. Standards for Reporting Implementation Studies (StaRI): explanation and elaboration document. BMJ Open 2017;7:e013318–e113318.
[72]. McCarter K, Britton B, Baker AL, et al. Interventions to improve screening and appropriate referral of patients with cancer for psychosocial distress: systematic review. BMJ Open 2018;8:
[73]. Butow P, Shaw J, Shepherd HL, et al. Comparison of implementation strategies to influence adherence to the clinical pathway for screening, assessment and management of anxiety and depression in adult cancer patients (ADAPT CP): study protocol of a cluster randomised controlled trial. BMC Cancer 2018;18:1077.
[74]. Loscalzo M, Clark KL, Holland J. Successful strategies for implementing biopsychosocial screening. Psychooncology 2011;20:455–462.
[75]. Lewis CC, Fischer S, Weiner BJ, Stanick C, Kim M, Martinez RG. Outcomes for implementation science: an enhanced systematic review of instruments using evidence-based rating criteria. Implement Sci 2015;10:1–17.
[76]. Mitchell SA, Chambers DA. Leveraging implementation science to improve cancer care delivery and patient outcomes. J Oncol Pract 2017;13:523–529.
[77]. Stokols D, Misra S, Moser RP, Hall KL, Taylor BK. The ecology of team science: understanding contextual influences on transdisciplinary collaboration. Am J Prev Med 2008;35 (2 suppl):S96–115.
[78]. Hall KL, Feng AX, Moser RP, Stokols D, Taylor BK. Moving the science of team science forward: collaboration and creativity. Am J Prevent Med 2008;35 (2 suppl):S243–S249.
[79]. van der Donk LJ, Tovote KA, Links TP, et al. Reasons for low uptake of a psychological intervention offered to cancer survivors with elevated depressive symptoms. Psychooncology 2019;28:830–838.
[80]. Minkler M, Salvatore AL. Participatory approaches for study design and analysis in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, eds. Dissemination and Implementation Research in Health. 2nd ed. New York: Oxford; 2017.
[81]. Boaz A, Hanney S, Borst R, O'Shea A, Kok M. How to engage stakeholders in research: design principles to support improvement. Health Res Policy Syst 2018;16:60.
[82]. Drahota A, Meza RD, Brikho B, et al. Community-academic partnerships: a systematic review of the state of the literature and recommendations for future research. The Milbank Quarterly 2016;94:163–214.
[83]. Deverka PA, Lavallee DC, Desai PJ, et al. Stakeholder participation in comparative effectiveness research: defining a framework for effective engagement. J Comp Eff Res 2012;1:181–194.
[84]. Concannon TW, Meissner P, Grunbaum JA, et al. A new taxonomy for stakeholder engagement in patient-centered outcomes research. J Gen Intern Med 2012;27:985–991.
[85]. Lobb R, Ramanadhan S, Murray L. Dissemination and implementation research in a global context. In: Brownson RC, Colditz GA, Proctor EK, eds. Dissemination and Implementation Research in Health. 2nd ed. New York: Oxford; 2017.
[86]. Ugalde A, Gaskin CJ, Rankin NM, et al. A systematic review of cancer caregiver interventions: appraising the potential for implementation of evidence into practice. Psychooncology 2019;28:687–701.
[87]. Keesing S, McNamara B, Rosenwax L. Cancer survivors’ experiences of using survivorship care plans: a systematic review of qualitative studies. J Cancer Surviv 2015;9:260–268.
[88]. Kerner J, Tajima K, Yip CH, et al. Knowledge exchange--translating research into practice and policy. Asian Pac J Cancer Prevent 2012;13 (4 suppl):37–48.
[89]. Weiner B. A theory of organizational readiness for change. Implement Sci 2009;4:67.
[90]. Shea CM, Jacobs SR, Esserman DA, Bruce K, Weiner BJ. Organizational readiness for implementing change: a psychometric assessment of a new measure. Implement Sci 2014;9:7–17.
[91]. Morris ZS, Wooding S, Grant J. The answer is 17 years, what is the question: understanding time lags in translational research. J Royal Soc Med 2011;104:510–520.
[92]. World Health Organisation. Global status report on noncommunicable diseases. Geneva: World Health Organization; 2014.
[93]. Colditz GA, Wolin KY, Gelhert S. Applying what we know to accelerate cancer prevention. Sci Transl Med 2012;4:127rv4.
[94]. Sharpe L, Curran L, Butow P, Thewes B. Fear of cancer recurrence and death anxiety. Psychooncology 2018;27:2559–2565.
[95]. National Cancer Institute. Research-Tested Intervention Programs (RTIPS) database. Published 2019. Accessed May 31, 2019.
[96]. Indig D, Lee K, Grunseit A, Milat A, Bauman A. Pathways for scaling up public health interventions. BMC Public Health 2017;18:68.
[97]. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care 2012;50:217–226.
[98]. Bernet AC, Willens DE, Bauer MS. Effectiveness-implementation hybrid designs: implications for quality improvement science. Implement Sci 2013;8:S2.
[99]. Johnson JE, Miller TR, Stout RL, et al. Study protocol: Hybrid Type I cost-effectiveness and implementation study of interpersonal psychotherapy (IPT) for men and women prisoners with major depression. Contemp Clin Trials 2016;47:266–274.
[100]. Cully JA, Armento MEA, Mott J, et al. Brief cognitive behavioral therapy in primary care: a hybrid type 2 patient-randomized effectiveness-implementation design. Implement Sci 2012;7:64.
[101]. Simmons MM, Gabrielian S, Byrne T, et al. A Hybrid III stepped wedge cluster randomized trial testing an implementation strategy to facilitate the use of an evidence-based practice in VA Homeless Primary Care Treatment Programs. Implement Sci 2017;12:46.
[102]. Roy-Byrne P, Craske MG, Sullivan G, et al. Delivery of evidence-based treatment for multiple anxiety disorders in primary care: a randomized controlled trial. JAMA 2010;303:1921–1928.
[103]. Cully JA, Stanley MA, Petersen NJ, et al. Delivery of Brief Cognitive Behavioral Therapy for Medically Ill Patients in Primary Care: A Pragmatic Randomized Clinical Trial. J Gen Intern Med 2017;32:1014–1024.
[104]. Damschroder LJ, Reardon CM, AuYoung M, et al. Implementation findings from a hybrid III implementation-effectiveness trial of the Diabetes Prevention Program (DPP) in the Veterans Health Administration (VHA). Implement Sci 2017;12:94.
[105]. Joyce C, Schneider M, Stevans JM, Beneciuk JM. Improving physical therapy pain care, quality, and cost through effectiveness-implementation research. Phys Ther 2018;98:447–456.
[106]. Aarons GA, Sklar M, Mustanski B, Benbow N, Brown CH. “Scaling-out” evidence-based interventions to new populations or new health care delivery systems. Implement Sci 2017;12:111.
[107]. Simon MA, O’Brian CA, Kanoon JM, et al. Leveraging an implementation science framework to adapt and scale a patient navigator intervention to improve mammography screening outreach in a new community. J Cancer Educ 2019.
[108]. Neta G, Sanchez MA, Chambers DA, et al. Implementation science in cancer prevention and control: a decade of grant funding by the National Cancer Institute and future directions. Implement Sci 2015;10:4–14.
[109]. Heller C, de Melo-Martin I. Clinical and Translational Science Awards: can they increase the efficiency and speed of clinical and translational research? Acad Med 2009;84:424–432.
[110]. McLean R, Tucker J. Evaluation of CIHR's Knowledge Translation Funding Program. Ottawa, Ontario: Canadian Institutes of Health Research; 2011.
[111]. Maruthappu M, Head MG, Zhou CD, et al. Investments in cancer research awarded to UK institutions and the global burden of cancer 2000–2013: a systematic analysis. BMJ Open 2017;7:e013936.
[112]. Comeau DL, Escoffery C, Freedman A, Ziegler TR, Blumberg HM. Improving clinical and translational research training: a qualitative evaluation of the Atlanta Clinical and Translational Science Institute KL2-mentored research scholars program. J Invest Med 2017;65:23–31.
[113]. Fudge N, Sadler E, Fisher HR, Maher J, Wolfe CD, McKevitt C. Optimising translational research opportunities: a systematic review and narrative synthesis of basic and clinician scientists’ perspectives of factors which enable or hinder translational research. PLoS One 2016;11:e0160475.
[114]. Jansen F, van Zwieten V, Coupé VMH, Leemans CR, Verdonck-de Leeuw IM. A review on cost-effectiveness and cost-utility of psychosocial care in cancer patients. Asia Pac J Oncol Nurs 2016;3:125–136.
[115]. Dieng M, Cust AE, Kasparian NA, Mann GJ, Morton RL. Economic evaluations of psychosocial interventions in cancer: a systematic review. Psychooncology 2016;25:1380–1392.
[116]. Gordon LG, Beesley VL, Scuffham PA. Evidence on the economic value of psychosocial interventions to alleviate anxiety and depression among cancer survivors: a systematic review. Asia Pac J Clin Oncol 2011;7:96–105.
[117]. Carlson LE, Bultz BD. Efficacy and medical cost offset of psychosocial interventions in cancer care: making the case for economic analyses. Psychooncology 2004;13:837–849. discussion 850–836.

Keywords: Implementation science; Psycho-oncology; Service delivery

Copyright © 2019 The Authors. Published by Wolters Kluwer Health Inc., on behalf of the International Psycho-Oncology Society.