DISCUSSION PAPER

The updated Joanna Briggs Institute Model of Evidence-Based Healthcare

Jordan, Zoe PhD; Lockwood, Craig PhD; Munn, Zachary PhD; Aromataris, Edoardo PhD

International Journal of Evidence-Based Healthcare 17(1):58–71, March 2019. DOI: 10.1097/XEB.0000000000000155

Abstract

Background: 

The Joanna Briggs Institute Model for Evidence-Based Healthcare was first conceptualized in 2005. This developmental framework for evidence-based practice situated healthcare evidence, in its broadest sense, and its role and use within complex healthcare settings. The Model was recently reviewed with a view to understanding its utility by health professionals, researchers and policy makers, and the analysis revealed a need to reconsider the composition and language of the Model to ensure its currency on the international stage.

Main body: 

The current article proposes a revised Joanna Briggs Institute Model for consideration by the international community. It seeks to clarify the conceptual integration of evidence generation, synthesis, transfer and implementation, linking how these occur with the necessarily challenging dynamics that contribute to whether translation of evidence into policy and practice is successful. It also accounts for the role of different types of evidence, both research and text and opinion, and how evidence contributes to achieving improved health outcomes globally. In addition, it is centered on the importance of accounting for evidence of feasibility, appropriateness, meaningfulness and effectiveness.

Conclusion: 

The Model has been an important part of the Institute's development, from both a scientific and an organizational perspective. Given the changing international discourse relating to evidence and its translation into policy and practice over the course of the last decade, it was opportune to revisit the Model and assess its ongoing applicability in its current form. Some alterations have been made for consideration, in the hope that the Model reflects the Institute's current conceptualization of evidence-based healthcare (EBHC) and increases its relevance and pragmatic use.

Background

The Joanna Briggs Institute (JBI) Model of Evidence-Based Healthcare (referred to hereafter as ‘the Model’) was first published in 2005 and since then it has been referenced widely in the literature.1 This developmental framework for evidence-based practice situated healthcare evidence, in its broadest sense, and its role and use within complex healthcare settings. It conceptualized evidence-based practice as ‘clinical decision-making that considers the best available evidence; the context in which the care is delivered, client preference and the professional judgment of the health professional’.2 In addition, the model has become an important marker of the Institute's unique and distinctive conceptualization of evidence-based healthcare (EBHC) and how it is operationalized (Fig. 1).

Figure 1:
Original Joanna Briggs Institute Model of Evidence-Based Healthcare. This is the original Joanna Briggs Institute Model of Evidence-Based Healthcare, developed in 2005: a developmental framework for evidence-based practice that situated healthcare evidence, in its broadest sense, and its role and use within complex healthcare settings.

The Model further depicts the four major components of EBHC as evidence generation, evidence synthesis, evidence transfer and evidence utilization, each modeled to incorporate its essential elements. In 2015, a working group at JBI updated the Model based on the results of a citation analysis and a process of stakeholder engagement.1 The purpose of this article is to describe the new JBI Model in detail and to explain the rationale for the major changes.

The updated Joanna Briggs Institute Model of Evidence-Based Healthcare

The following describes each segment of the updated JBI Model of Evidence-Based Healthcare and its component parts (Fig. 2).

Figure 2:
New Joanna Briggs Institute Model of Evidence-Based Healthcare. The figure represents the proposed new Joanna Briggs Institute Model of Evidence-Based Healthcare. The ‘inner segments’ provide the Institute's conceptualization of the major steps involved in the process of achieving an evidence-based approach to clinical decision-making, whereas the ‘outer segments’ operationalize the component parts of the model and articulate how they might be actioned in a pragmatic way.

Structure and design notes

It is important to acknowledge the corporate investment in this Model. Having been in circulation for ten years, the Model is now broadly associated with JBI, and thus the intent was not to dilute that association, but rather to enhance and strengthen it. The Model has become a fundamental framework for how JBI and the Joanna Briggs Collaboration (JBC) are organized and operate; accordingly, although some minor structural and design alterations have been made, the integrity of the original Model has been preserved. Changes to the structure and design of the Model include color changes, the consistency of the outside wedges and the introduction of arrows indicating a bidirectional flow throughout the Model. This is an important clarification: where high-quality evidence exists for a topic, there is no need for additional research (evidence generation), and starting with synthesis offers the most direct route to informing best practice. In the absence of high-quality research, further empirical study is required. In many practice domains, there is a need for sustained educational interventions to promote evidence transfer, and this may equally link to implementation. The Model is thus not a linear approach, but a highly flexible, context-sensitive approach to modeling EBHC.
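
As a purely illustrative aside, this non-linear, bidirectional flow can be sketched as a simple cycle; the wedge names below follow Figure 2, while the cyclic-graph framing (in Python) is our own and not part of the published Model:

    # Purely illustrative sketch: the five wedges of the Model as a cycle.
    # The arrows are bidirectional, so each wedge links to both of its
    # neighbours, and any wedge is a legitimate entry point.
    WEDGES = [
        "global health",
        "evidence generation",
        "evidence synthesis",
        "evidence transfer",
        "evidence implementation",
    ]

    def neighbours(wedge: str) -> list[str]:
        # Return the wedges reachable one step away in either direction.
        i = WEDGES.index(wedge)
        return [WEDGES[(i - 1) % len(WEDGES)], WEDGES[(i + 1) % len(WEDGES)]]

    # A user who already has high-quality evidence for a topic might enter at
    # synthesis and move forward to transfer or back to generation:
    print(neighbours("evidence synthesis"))
    # ['evidence generation', 'evidence transfer']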

Color theory provides a logical structure for the use of color. The reconceptualization of the Model was taken as an opportunity to create color harmony within its structure, and the wedges now follow the colors of the visible spectrum in the correct sequence, which helps to create a sense of order, balance and logical structure in the visual experience. From a design perspective, the human brain rejects what it cannot organize, and color harmony thus ensures that the information being delivered is easily processed. In addition, the outer sections (three for each wedge) are a paler version of the internal wedge color, so that they act to support the importance of the inner wedges.

The inner circle (pebble of knowledge) has remained largely untouched from a design perspective, and its color has been maintained in line with the pebble that sits within the JBI logo, representing the ‘pebble of knowledge’ and its spreading ripples.3 Red is used for the central ‘pebble’ circle, and the wedge now called evidence implementation has therefore been changed from red to orange, so as not to imply a stronger relationship between these two elements than between the others. The ‘inner segments’ provide the Institute's conceptualization of the major steps involved in the process of achieving an evidence-based approach to clinical decision-making, whereas the ‘outer segments’ operationalize the component parts of the Model and articulate how they might be actioned in a pragmatic way.

The flow is indicated by arrows: a large arrow flows clockwise, while the smaller arrows represent the ‘feedback cycle’. It was important to acknowledge that EBHC is not a clean, linear process, and users will engage with the Model from the starting point that best meets their needs. The starting point for a user may be global health, or synthesis, or transfer, or, in the right context, the point of implementation. The arrows are not the same size, to ensure directional clarity and to avoid the appearance of being prescriptive; making the arrows the same size would have implied confusion regarding the preferred direction.

The ‘pebble of knowledge’

The central component of the JBI Model (the ‘pebble’ – aka the ‘pebble of knowledge’ as per the JBI logo) is designed to depict the Institute's conceptualization of EBHC. In the original Model, evidence-based practice was defined as a process of clinical decision-making that ‘considers the best available evidence; the context in which the care is delivered; client preference; and the professional judgment of the health professional’ (Fig. 3).2

Figure 3:
The ‘pebble of knowledge’. This figure represents the central component of the new Joanna Briggs Institute Model (the ‘pebble’ – aka the ‘pebble of knowledge’ as per the Joanna Briggs Institute logo) and is designed to depict the Institute's conceptualization of EBHC as it relates to the feasibility, appropriateness, meaningfulness and effectiveness of health policy and practice.

Pearson et al.2 define evidence as ‘the basis of belief; the substantiation or confirmation that is needed to believe that something is true’. For health professionals to be able to establish the utility of a broad range of interventions and procedures, a broad conceptualization of evidence is required. Although evidence of effectiveness is acknowledged as being of value, other types of evidence are considered equally important, as they are designed to answer different clinical questions.

When making clinical decisions, health professionals are concerned with whether their approach is Feasible, Appropriate, Meaningful and Effective (also known as the FAME framework). Since the inception of JBI, there has been a focus on ensuring that health professionals have access to information that addresses the different types of questions that may arise in clinical practice. This unique articulation of what constitutes evidence for decision-making was a first in the field at the time of publication of the original Model in 2005. The FAME framework and this broader conceptualization of evidence are frequently cited and clearly resonate with those seeking to conduct research that is relevant to point-of-care decision-making.

The center of the new Model demonstrates this broader conceptualization, encompassing:

  1. Feasibility (the extent to which an activity or intervention is practical or viable in a context or situation, including cost-effectiveness).
  2. Appropriateness (the extent to which an intervention or activity fits with a context or situation).
  3. Meaningfulness (how an intervention or activity is experienced by an individual or group, and the meanings they ascribe to that experience).
  4. Effectiveness (the extent to which an intervention achieves the intended result or outcome).

As such, we define evidence-based healthcare as clinical decision-making that considers the feasibility, appropriateness, meaningfulness and effectiveness of healthcare practices. The feasibility, appropriateness, meaningfulness and effectiveness of healthcare practices may be informed by the best available evidence, the context in which the care is delivered, the individual patient, and the professional judgment and expertise of the health professional.
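
As a further illustrative aside, the four FAME questions can be pictured as fields of a simple appraisal record; the sketch below is a toy framing of our own (the practice named is hypothetical), not part of the published Model:

    # Purely illustrative: the FAME dimensions as a simple appraisal record.
    # Field comments paraphrase the definitions listed above.
    from dataclasses import dataclass

    @dataclass
    class FameAppraisal:
        practice: str
        feasible: bool      # practical/viable in context, incl. cost-effectiveness
        appropriate: bool   # fits the context or situation
        meaningful: bool    # how the practice is experienced by individuals/groups
        effective: bool     # achieves the intended result or outcome

        def supports_adoption(self) -> bool:
            # Toy rule: all four FAME dimensions must be satisfied.
            return all((self.feasible, self.appropriate,
                        self.meaningful, self.effective))

    # Hypothetical example:
    appraisal = FameAppraisal("bedside handover", True, True, True, True)
    print(appraisal.supports_adoption())  # True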

Global health

In the original article, this sector of the Model was similarly referred to as ‘global health’; however, it was not explicitly addressed. The article did state: ‘the achievement of improved global health is conceptualized as both the goal and endpoint of any or all of the model components and the raison d’etre and driver of evidence-based healthcare’.2 This assertion remains an important element of our conceptualization of evidence-based practice, and hence this wedge of the Model has been moved to the top/center. Following due consideration of the available literature relating to the term global health and its use, as well as consultation with international partners, the term has been maintained, but better defined for the purpose of the Model (Fig. 4).

Figure 4:
Global Health. This figure represents the Global Health segment of the new Joanna Briggs Institute Model and the three component parts are considered as ‘sustainable impact’, ‘engagement’ and ‘knowledge need’.

With the advent of globalization, terms such as international health, public health and global health are ubiquitous, and although global health seems to be emerging as the preferred term, there is still some debate regarding its appropriateness. In 2010, Bozorgmehr4 took a dialectic approach to defining the ‘global’ in global health, stating that the term global can be understood in different ways and arguing that it lacks specificity and can be misleading or produce redundancy with other health-related fields or disciplines.

Given that various individuals and groups often attach normative objectives a priori to the term global health, it was deemed necessary to evolve the language in this sector of the Model to avoid ambiguity. As such, the term is defined here as collaborative transnational research and action that places priority on improving health and achieving health equity for all people worldwide. This definition is based on the works of Koplan5 and Beaglehole and Bonita.6 We contend that this conceptualization of global health more clearly articulates the position of JBI and the movement of global knowledge into local practice. Of course, when using the Model as a heuristic for evidence-based healthcare locally, health professionals can essentially substitute the name of their local organization for the term global health. The three component parts of this wedge of the Model are sustainable impact, engagement and knowledge need.

Sustainable impact

Evidence implementation activities often succeed in making a change to healthcare practices. However, due to resourcing issues and the ever-changing nature of health services, these changes may only be temporary. To truly address and improve healthcare, any positive improvements need to be lasting. Sustainable impact can only be achieved where there is collective conceptual clarity around the motivation and perceived benefits of an evidence-informed approach to healthcare decision-making and the strategies for operationalizing it. It is likely that sustainability thresholds will be reached, given the changeable nature of the healthcare environment. However, it is our belief that, if research questions are derived from the knowledge needs of the community and a collaborative approach that accounts for local application is utilized, then sustained impact is a far more likely outcome.

Engagement

To successfully address the significant issues faced in delivering evidence-informed healthcare, collaboration is essential across all involved stakeholders and groups. This ranges from local collaborations between health services and academia, to international collaboration among governments, research units and health organizations. The Institute has, since its inception, forged local and global partnerships to ensure that activities were ‘context driven by individuals and groups who understood their very specific healthcare environments and the forces that would work both for and against them’.3

Knowledge need

‘Gathering knowledge of what people need, what resources are available, and what limits constrain their choices’ is vital to an evidence-based approach to the delivery of healthcare.7 JBI has long asserted that the role of evidence is to address the knowledge requirements of the community (i.e. clinicians, patients/consumers, governments and other organizations). It is these explicit questions or concerns that are also encompassed in this wedge of the JBI Model. Indeed, a significant gap associated with the translation of research into action has been the gap from knowledge need to discovery. As Pearson et al.8 suggest, ‘within this gap there can be an integrated approach to topic selection, where there is active collaboration between those conducting research and the end users of research’.

Evidence generation

It is now broadly accepted that evidence can take many forms and that, in the real world of practice and policy making, decision makers are influenced by a variety of understandings and sources of evidence: habits and tradition, experience, expertise, reasoning, trial and error, research and many others.9 Evidence generation is defined as occurring through well-designed research studies grounded in any methodological position, as well as through anecdote, opinion and experience (Fig. 5).

Figure 5:
Evidence Generation. This figure represents the Evidence Generation segment of the new Joanna Briggs Institute Model and the three component parts are considered as ‘research’, ‘expertise’ and ‘discourse’.

Systematic reviews are as important as primary research in this area, as they can identify important gaps in what is known about a field, intervention or practice (hence the importance of the two-way, bidirectional arrows).

The evidence generation wedge of the Model identifies, as in the original, discourse (or narrative), experience/expertise and research as legitimate means of knowledge generation. What has been removed from the wedge is the FAME scale, given its relevance across the entire Model, its shift to the center of the conceptualization of evidence-informed healthcare, and the reconceptualization of its elements as key questions or concepts for decision-making rather than as types of evidence or systematic reviews.

Research

JBI considers that the generation of new knowledge may occur through either primary or secondary research. In the first instance, ‘the results of well-designed research studies grounded in any methodological position are seen to be more credible as evidence than anecdotes or personal opinion’.2 However, research does not always exist for every intervention, practice or procedure. In these instances, clinicians are still required to make choices about the care provided and so must look to other types of evidence/knowledge to inform their decision-making.

Expertise (and experience)

The term ‘expertise’ here can be thought of as similar to the more frequently used term ‘clinical judgment’ or even ‘clinical wisdom’. The term ‘experience’ here is synonymous with the more frequently used term ‘patient preferences and values’. It is important to highlight that the terms ‘expertise’ and ‘experience’ can refer to knowledge generated by the clinician and/or the client/patient. Only the patient can provide information regarding his or her own experience (and their expertise in everyday living). When discussing evidence generation at the microlevel, this knowledge should contribute to a process of shared decision-making (where possible).

Expertise (and experience) can be captured either within formal rigorous qualitative inquiry (such as Delphi approaches) or through discursive experiential accounts. Drawing on the conceptualizations in the global health wedge and the component that relates to knowledge need, JBI positions expertise/experience as able to inform primary research; secondary analysis in the form of systematic reviews (and the role of expert advisory panels); direct clinical decision-making; and implementation programs. Thus, expertise/experience is acknowledged as a vital form of evidence within this framework.

Discourse

Discourse can be defined as a written communication or debate based on personal anecdote or experience. The term is conceptually broad and has wide applicability across all settings. Two types of discourse, namely ‘little d’ and ‘big D’ discourse have been conceptualized in the literature.10 ‘Little d’ discourse refers to talk and text in local social interaction and ‘big D’ discourse (or Discourse) refers to culturally ‘standardized ways of referring to/constituting a certain type of phenomenon’ (p. 1134).10 This is as opposed to ‘communication, which is defined as the means by which messages are imparted, transmitted or conveyed’.11 Within the context of EBHC and the JBI Model, discourse is viewed as incorporating both big D and little d discourse and as ‘operating or taking effect through communicative functions – communications activities or tactics are the symbolic interactions through which discourses are revealed’.11

The processes of identifying what type of evidence is required to answer a question and what type of evidence is available (research, expertise/experience or discourse) are fundamental to the movement of evidence into practice. Although the randomized controlled trial is still recognized by many as the gold standard, the importance and significance of other sources of evidence continue to gain respect, particularly among direct care providers. Due weight must, of course, continue to be afforded to research evidence, clinical wisdom, and patient preferences and values.

Evidence synthesis

The original Model defined evidence synthesis as ‘the evaluation or analysis of research evidence and opinion on a specific topic to aid in decision-making in healthcare’ and conceptualized it as having three main components (theory, methodology and systematic review of evidence).2 In the current reconceptualization, although the definition of evidence synthesis remains true and accurate, we would argue that a significant component of synthesis (i.e. collation) is missing. We also propose that a more meaningful representation of the three main pragmatic components of the wedge for JBI operations is in fact systematic reviews, evidence summaries and guidelines (Fig. 6).

Figure 6:
Evidence Synthesis. This figure represents the Evidence Synthesis segment of the new Joanna Briggs Institute Model and the three component parts are considered as ‘guidelines’, ‘evidence summaries’ and ‘systematic reviews’.

Systematic reviews

The core of evidence synthesis efforts remains the systematic review, which is in and of itself a form of research (secondary research). Systematic review methodology is rapidly evolving, with the types of reviews being conducted ranging from traditional reviews of effects to reviews of qualitative research,12 economic and cost-effectiveness research,13 umbrella reviews14 and scoping reviews,15 to name a few. The scope for reviews is immense, making their applicability and relevance to practice even stronger.

Evidence summaries

Smaller scale evidence summaries, or rapid reviews, have also emerged as a streamlined approach to synthesizing evidence in a timely manner.16 Although systematic reviews are still considered the gold standard in knowledge synthesis, they are not without their limitations. As Khangura et al.17 identify, systematic reviews typically take anywhere between 6 months and 2 years to complete and often focus on a narrow clinical question. They suggest that evidence summaries offer something new and potentially valuable to the synthesis repertoire, in a way that better addresses the needs of policy makers, decision makers, stakeholders and other knowledge users. Evidence summaries are important to EBHC when the relevance, quality and breadth of available topics meet local knowledge needs.18

Guidelines

Clinical guidelines can be defined as ‘statements that include recommendations intended to optimize patient care that are informed by a systematic review of evidence and an assessment of the benefits and harms of alternative care options’.19 Not all guidelines are created equal; a guideline and the evidence it is based upon must be carefully critiqued before it can be considered a suitable tool for knowledge transfer. Characteristics of ‘trustworthy’ guidelines include a rigorous development methodology, clear reporting of recommendations linked to the evidence, the use of systematic reviews in their development, and a transparent process that includes extensive external review.19

Trustworthy clinical guidelines, developed following a rigorous methodology and considering the evidence, the balance between benefits and harms, patient values and preferences, and resources, are the ideal source of recommendations for practice. We would also suggest that, ideally, clinical guidelines would include in their development not only systematic reviews of the effectiveness of interventions for certain conditions, but also reviews of the feasibility, appropriateness and meaningfulness of healthcare practices.

The shift of guidelines from transfer, as in the original Model, to generation and synthesis has been made because, when developed appropriately, guidelines integrate not only the results of systematic reviews but also clinical expertise and patient experiences.

Evidence transfer

Evidence transfer is defined as ‘the act of transferring knowledge to individual health professionals, health facilities and health systems globally by means of journals, other publications, electronic media, education and training and decision support systems’.2 However, we take the position that the production of additional ‘derivative products’ from systematic reviews should not be a passive activity, and as such we have redefined transfer to mean a coactive, participatory process to advance access to and uptake of evidence in local contexts. In adjusting this definition, we intend to reframe transfer as a potential causal phenomenon (i.e. a factor that enables, facilitates and supports evidence implementation). In this way, transfer moves beyond a single interaction, such as a ‘publication’ (Fig. 7).

Figure 7:
Evidence Transfer. This figure represents the Evidence Transfer segment of the new Joanna Briggs Institute Model and the three component parts are considered as ‘education’, ‘systems integration’ and ‘active dissemination’.

In the original Model, the component parts included systems, information and education. In this iteration, we propose that evidence transfer incorporates active dissemination, education and systems integration. The fundamental components of this process have been articulated as:

  1. Development of understandable and actionable messages;
  2. Accommodation of the context of the target audience's information needs; and
  3. Delivery of messages in cost-effective ways (including information technology, print material, meetings, workshops and training programs).9

Active dissemination

Fundamental to the process of evidence-informed decision-making is the ability of those at the point of care to access synthesized research evidence. Active (rather than passive) dissemination is therefore an important component of this wedge of the JBI Model. This is largely a communicative function aimed at spreading knowledge/evidence on a large scale within and across geographic locations, practice settings and other networks of end users.20 As indicated in a systematic review commissioned by the Agency for Healthcare Research and Quality Effective Healthcare Program, multicomponent, blended communication-style dissemination strategies are more effective at enhancing clinician behavior, particularly guideline adherence.20 Active dissemination includes active methods to spread information (email, social media), formats that encourage motivation/uptake (infographics, decision aids, icon arrays) and knowledge spreaders (champions, thought leaders). Passive dissemination is of course still important, but we need to be aware of its limitations.

Systems integration

Systems integration might involve the inclusion of an evidence base in clinical decision support systems, electronic medical records or quality systems, but it may also involve the embedding of evidence in broader systems, policies and procedures.

Education

Equally, educational programs have been identified as consistently effective strategies for evidence transfer. This might include education regarding the evidence related to an intervention or practice; it could also involve continuing professional development, or broader programs at award and nonaward levels, that take participants through the rationale for evidence-informed approaches to clinical decision-making, methods for evidence synthesis or pragmatic strategies for implementation.

Evidence implementation

The first and most obvious change to this component of the Model is the change from utilization to implementation. The terms implementation and utilization have both been commonly used in the extant literature; however, implementation better reflects this activity within the context of the JBI Model. Evidence implementation in the context of the JBI Model is defined as a purposeful and enabling set of activities designed to engage key stakeholders with research evidence to inform decision-making and generate sustained improvement in the quality of healthcare delivery (Fig. 8).

Figure 8:
Evidence Implementation. This figure represents the Evidence Implementation segment of the new Joanna Briggs Institute Model and the three component parts are considered as ‘context analysis’, ‘facilitation of change’ and ‘evaluation of process and outcome’.

Within this wedge, the original Model incorporated the components of embedding system/organizational change, practice change and the evaluation of impact on system/process/outcome. However, the field of EBHC has moved beyond the view that integrating evidence into clinical decision support systems equates to achieving implementation; much implementation science research now recognizes a need for localized situational analysis, solution-building processes, and sustainable implementation and evaluation.21 We propose that a more appropriate reflection of the components of evidence implementation includes context analysis, the facilitation of practice change, and the evaluation of process and outcome.

The domains of activity for evidence implementation based upon local planning, facilitatory activities, and evaluation and sustainability have been extensively studied and are well supported empirically and theoretically.22,23

Context analysis

A context analysis is diagnostic in nature: its purpose is both to understand issues within their local context that are important to practice change, and to identify factors likely to influence the proposed change. Understanding change, or creating the case for change, requires data collection, working with stakeholders and gathering informed support for a change process to occur. The choice of dissemination and implementation interventions should be guided by the diagnostic analysis and informed by knowledge of relevant research.21

Facilitation of change

Practice change, whether multisite, organizational or at the level of single wards or units, requires sustained facilitation. Facilitation is a skilled approach to enabling others, engaging with stakeholders and using leadership skills to enable day-to-day practice change requirements as well as to address potential organizational barriers. The strategies, leadership framework and skill set associated with facilitation are increasingly considered transferable between projects, places and people; hence, experience in practice change enhances and contributes to organizational knowledge.23 More specifically, evaluation processes that identify barriers and facilitators to change are considered integral to practice change; the methods may include assessment of the practice change, engaging with individuals and stakeholders, and evaluating the local practice setting and organizational characteristics and culture.24

Evaluation of process and outcome

Any systematic approach to changing professional practice should include plans to monitor and evaluate, and to maintain and reinforce, any change. The basic principles of the Donabedian model have been expanded upon, but practice change in the JBI conceptual model includes a focus on the structures, processes and outcomes related to global healthcare delivery.

Due to the significant influence that facilitation has on the effectiveness and sustainability of practice change, we argue that local champions, opinion leaders or clinicians (whether through audit and feedback or other programs) are essential for successful implementation of evidence. The success of implementation depends on how well the change process accounts for the complex, multidimensional nature of the healthcare environment (systems thinking).25 Discovering better ways to ensure patients receive the care they need is not easy and poses formidable methodological challenges. The overlap with the quality improvement field and its parent field of complexity theory is considerable.

Drawing on existing models and theories about change management and knowledge translation, the evidence implementation wedge of the JBI Model seeks to ensure that this process is one that is cognizant of local culture and context, that builds capacity and supports and reinforces existing infrastructure in a sustainable fashion.

Overarching principles: culture, capacity, communication and collaboration

The complex and inimitable healthcare environment means that there is no single, linear approach that will work every time to move evidence into policy and practice. Indeed, recommendations will not always be feasible, appropriate, meaningful or effective in a given context. As such, we propose that the overarching principles of this process are culture, capacity, communication and collaboration. In this way, issues relating to stakeholder engagement, the localization of knowledge, responsiveness to local knowledge requirements, shared decision-making and sustainability are acknowledged.

Invariably, discourse and communication are fundamental to the translational agenda. However, it has been argued that communication is only implied in the JBI Model, rather than explicit.26 Given the recognition that evidence translation is generally a largely discursive activity that takes place within a global context, a transparent and flexible approach is advocated, one that utilizes a broad array of communicative activities to promote collective understanding, identity, and mutually beneficial goals and objectives.

It is important that this Model is not seen to be reductive in character, or to ignore in any way the importance of social, cultural and historical organizational and individual influences on clinical decision-making. There must be understanding of the ‘sameness’ and ‘uniqueness’ of the actors participating in the process of moving evidence into policy and practice, and a willingness, to a certain extent, to openly acknowledge it as an ongoing, organic, evolutionary process that requires constructive, coactive partnership across sectors, groups and individuals.

Discussion

Following a comprehensive review process, the JBI Model of Evidence-Based Healthcare underwent a series of modifications, some major and some minor. The aim of that review process was to ensure that the Model continued to accurately reflect the evidence-based movement and to articulate the associated steps in a more pragmatic way.

In an earlier draft of the updated model, the term ‘evidence-informed’ was used in preference to ‘evidence-based’. In recent years, there has been some debate in the literature over whether the term evidence-based or evidence-informed is more appropriate. Some authors have ardently defended the term evidence-based practice, arguing that a change in terminology will weaken the movement and detract from the ‘science’ of delivering healthcare.27 Others have referred to the concept of evidence-based practice as being ‘paradigmatic imperialism’, making assumptions about the ‘fit’ of the broad range of treatments/therapies within this framework.28

It is interesting to examine the etymology of the two terms ‘informed’ and ‘based’. The word inform, from Middle English, means to give form or shape to, and the term based, from the Old French, means bottom or foundation. Within the framework of evidence and healthcare, either term could rationally work and would equally make sense. However, it is possible that evidence translation remains such a challenge partly because the term evidence-based has persisted along with the connotations health professionals associate with it (i.e. the ‘cook book’ approach to the delivery of healthcare often associated with ‘evidence-based practice’ may have continued to make health professionals resistant to it), and that an organic evolution toward the term evidence-informed would acknowledge the expertise of health professionals and the very important and active role they play in this process.

There is more to clinical decision-making than evidence alone. Given that JBI has conceptualized EBHC as clinical decision-making that considers the best available evidence, patient preference, context and clinical judgment, it could be argued that it is more appropriate to change this to evidence-informed as evidence forms only one part of the process. Part of the issue with the inference behind evidence-based is that there is a need to contextualize evidence for it to be effectively implemented into policy and practice. There are clearly a variety of factors, inputs and relationships that impact on evidence translation.

Although it seems that today the terms are used interchangeably, we believe that evidence-based remains a true reflection of where the movement has evolved to, and we believe that health professionals' collective understandings have come sufficiently far to overcome historical barriers to use of the term. It remains in line with other current international organizations and can be easily and meaningfully translated into other languages.

Conclusion

The Model has been an important part of the Institute's development, from both a scientific and an organizational perspective. It has provided a framework for the Institute's academic endeavors as much as an organizational construct for operations at both a local and an international level. Given the changing international discourse relating to evidence and its translation into policy and practice over the course of the last decade, it was opportune to revisit the Model and assess its ongoing applicability in its current form. Some alterations have been made for consideration, in the hope that the Model reflects the Institute's current conceptualization of EBHC and increases its relevance and pragmatic use.

Acknowledgements

The authors would like to thank members of the Joanna Briggs Collaboration Committee of Directors for providing their thoughts and feedback in relation to the JBI Model of Evidence-Based Healthcare.

Authors’ contributions: Z.J. drafted the first full version of the article; C.L., Z.M. and E.A. provided feedback and comment on the article; and all authors approved the final version to be published.

Ethics approval and consent to participate: not applicable.

Consent for publication: not applicable.

Availability of data and material: not applicable.

Conflicts of interest

The authors report no conflicts of interest.

References

1. Jordan Z, Lockwood C, Munn Z, Aromataris E. Redeveloping the JBI Model of Evidence-Based Healthcare: citation analysis and stakeholder engagement. Int J Evid Based Healthc 2018; 16:227–241.
2. Pearson A, Wiechula R, Court A, Lockwood C. The JBI Model of Evidence-based Healthcare. Int J Evid Based Healthc 2005; 3:207–215.
3. Jordan Z, Donnelly P, Pittman E. A short history of a big idea: The Joanna Briggs Institute 1996–2006. Melbourne, Australia: Ausmed Publications; 2006.
4. Bozorgmehr K. Rethinking the ‘global’ in global health: a dialectic approach. Glob Health 2010; 6:19.
5. Koplan JP, Bond TC, Merson MH, et al. Towards a common definition of global health. Lancet 2009; 373:1993–1995.
6. Beaglehole R, Bonita R. What is global health? Glob Health Action 2010; 3: doi: 10.3402/gha.v3i0.5142.
7. Jordan Z, Pearson A. International collaboration in translational health science. Philadelphia, PA: Lippincott, Williams and Wilkins; 2013.
8. Pearson A, Jordan Z, Munn Z. Translational science and evidence-based healthcare: a clarification and reconceptualization of how knowledge is generated and used in healthcare. Nurs Res Pract 2012; 2012:792519.
9. Pearson A, Weeks S, Stern C. Translation science and the JBI Model of Evidence Based Healthcare. Philadelphia, PA: Lippincott, Williams and Wilkins; 2011.
10. Alvesson M, Karreman D. Varieties of discourse: on the study of organizations through discourse analysis. Hum Relat 2000; 53:1125–1149.
11. Jordan Z. International collaboration in health sciences research: manna, myth and model (Doctoral thesis). South Australia: University of Adelaide; 2011.
12. Lockwood C, Munn Z, Porritt K. Qualitative research synthesis: methodological guidance for systematic reviewers utilizing meta-aggregation. Int J Evid Based Healthc 2015; 13:179–187.
13. Gomersall J, Jadotte Y, Xue Y. Conducting systematic reviews of economic evaluations. Int J Evid Based Healthc 2015; 13:170–178.
14. Aromataris E, Fernandez R, Godfrey C. Summarizing systematic reviews: methodological development, conduct and reporting of an umbrella review approach. Int J Evid Based Healthc 2015; 13:132–140.
15. Peters M, Godfrey C, Khalil H. Guidance for conducting systematic scoping reviews. Int J Evid Based Healthc 2015; 13:141–146.
16. Munn Z, Lockwood C, Moola S. The development and use of evidence summaries for point of care information systems: a streamlined rapid review approach. Worldviews Evid Based Nurs 2015; 12:131–138.
17. Khangura S, Konnyu K, Cushman R, et al. Evidence summaries: the evolution of a rapid review approach. Syst Rev 2012; 1:10.
18. Campbell J, Umapathysivam K, Xue Y, Lockwood C. Evidence-based practice point-of-care resources: a quantitative evaluation of quality, rigor, and content. Worldviews Evid Based Nurs 2015; 12:313–327.
19. Institute of Medicine of the National Academies. Clinical practice guidelines we can trust. Washington, DC: The National Academies Press; 2011.
20. RTI International-University of North Carolina Evidence-based Practice Center. Communication and dissemination strategies to facilitate the use of health-related evidence. Research Triangle Park, NC: Agency for Healthcare Research and Quality; 2013.
21. Harrison MB, Graham ID. Roadmap for a participatory research-practice partnership to implement evidence. Worldviews Evid Based Nurs 2012; 9:210–220.
22. Graham ID, Tetroe JM, Pearson A. Turning knowledge into action: practical guidance on how to do integrated knowledge translation research. Philadelphia: Lippincott Williams and Wilkins; 2014.
23. Lockwood C, Stephenson M, Lizerondo L, et al. Evidence implementation: development of an online methodology from the knowledge to action model of knowledge translation. Int J Nurs Pract 2016; 22:322–329.
24. Castiglione SA, Ritchie JA. Moving into action: we know what practices we want to change, now what? An implementation guide for healthcare practitioners. In: Graham ID, Tetroe JM, Pearson A, editors. Turning knowledge into action: practical guidance on how to do integrated knowledge translation research. 1st ed. Philadelphia: Lippincott Williams and Wilkins; 2017. pp. 98–138.
25. Graham I, Tetroe J. Some theoretical underpinnings of knowledge translation. Acad Emerg Med 2007; 14:936–941.
26. Manojlovich M, Squires JE, Davies B, Graham I. Hiding in plain sight: communication theory in implementation science. Implement Sci 2015; 10:58.
27. Melnyk BM, Newhouse R. Evidence-based practice versus evidence informed practice: a debate that could stall forward momentum in improving healthcare quality, safety, patient outcomes and costs. Worldviews Evid Based Nurs 2014; 11:347–349.
28. Bohart A. Evidence based psychotherapy means evidence-informed, not evidence-driven. J Contemp Psychother 2005; 35:39–53.
Keywords:

conceptual model; evidence-based healthcare; knowledge translation

International Journal of Evidence-Based Healthcare © 2019 The Joanna Briggs Institute