Health care improvement is essential for achieving care that is safe, high quality, value-based, and patient-centered. Health care improvement efforts, encompassing health care quality improvement (QI), patient safety, implementation, and innovation, are best when continuous and led by interprofessional health care teams. Academic health care institutions are often slow to adopt formalized processes that identify and recognize excellence in contributions to health care improvement. The reliance on peer-reviewed publications and grant funding, the traditional metrics for academic advancement, contributes to the neglect of professional advancement among faculty who focus on health care improvement. 1 These traditional metrics favor finite research projects with clear quantitative findings and short cycle times between acquiring grant funding and publishing in academic journals. 1,2
In contrast, faculty contributions to health care improvement involve ongoing, iterative, incremental, and often institution-specific work that is frequently not recognized or used for promotion by academic health care institutions. Faculty of all health professions who conduct health care improvement initiatives need alternative methods for documenting the rigor and impact of work that often lacks extramural grant funding and involves long, multifaceted processes with no definite endpoint. Criteria for excellence in health care improvement should also recognize projects that are challenging to sustain and require constant monitoring and improvement to prevent regression of results. These factors place individuals who specialize in health care QI, patient safety, implementation, and innovation at a disadvantage in terms of recognition of academic achievement for promotion and tenure applications. 1,3,4 Recognition of scholarship beyond traditional research has been suggested 5 but has yet to be widely adopted institutionally. Moreover, because health care improvement does not follow a traditional research model, new ways of recognizing faculty contributions and impacts are required in addition to conventional scholarship.
No consensus standards currently exist for documenting and acknowledging excellence in health care improvement and the requisite skills of interprofessional teamwork. 6 Formal processes, criteria, and portfolios for academic promotion exist for faculty whose mission is primarily education. 7 These educational excellence portfolios highlight an individual’s academic excellence in teaching and evaluation, development and dissemination of enduring education products, and educational leadership roles and awards. 8,9 However, interprofessional faculty who work primarily in clinical settings with other health care professionals but with few students and trainees cannot use educational portfolios to document their professional excellence for academic promotion and advancement. The first QI portfolio was developed more than a decade ago, in 2009, by the Society of General Internal Medicine (SGIM) 10 for a physician audience and has seen some adoption across health professions. 11 However, in the last 10 years, the need for all health care professions to contribute to interprofessional health care improvement has been well documented. 12 Novel methods for documenting achievements and excellence beyond grants, publications, and educational portfolios are needed to foster academic tenure and promotion for individuals from all professions who excel in the field of health care improvement. 1–4
To address this gap, we propose a health care improvement portfolio that captures the best features of educational excellence portfolios. A distinct health care improvement portfolio could highlight domains of academic excellence and impact in the areas of QI, patient safety, health care innovations, dissemination, and implementation applicable to interprofessional faculty for academic advancement, promotion, and tenure. In this article, we describe the development of a health care improvement portfolio using a modified Delphi consensus process with an expert panel of international, interprofessional leaders in health care improvement, patient safety, and implementation science. We then provide a case example of how the portfolio was used by one of us (M.A.M.), a physician faculty member, to achieve academic promotion.
Modified Delphi panel process overview
We used a modified Delphi consensus technique 13,14 based on the RAND method 15 and adapted for a virtual panel to develop the components of a health care improvement portfolio. The Delphi methodology is a content validation method that involves asking an expert panel a series of iterative questions. 13–15 The process built upon previously published techniques to develop consensus virtually rather than through traditional face-to-face methods. 14,16 We adopted a structured, 3-part methodology (see Figure 1). First, we conducted a literature review in fall 2017 to identify key domains to consider for inclusion in a health care improvement portfolio. Second, we drafted an initial portfolio to describe the proposed domains and methods for documenting and communicating excellence. Finally, we conducted a Delphi panel in spring 2018 to review the portfolio critically and produce a revised consensus document. We finalized the portfolio in fall 2018. The consensus process was QI in nature and did not constitute research based on Baylor College of Medicine’s institutional review board policies.
Literature review and portfolio development
We conducted a review of the literature to identify existing academic portfolios, many focused on education; to inform the structure of the portfolio; and to define the key domains comprising these portfolios to guide the content. Search terms included “portfolio,” “quality improvement,” and “health care improvement.” We also searched the internet for examples of educational portfolios to inform the language and structure of the health care improvement portfolio. After reviewing existing literature 2–7,11,17 and portfolios, 8–10,18–21 the development team (K.M.G., S.C., M.H., A.D.N.) drafted the health care improvement portfolio, which comprised domains relevant to the clinical, education, and research missions of academic health care institutions. Domains were similar to those found in educational portfolios and the original SGIM portfolio. 10,18–21 The draft portfolio, while similar to the original SGIM portfolio 10 in its domains, differed in its emphasis on specific language to describe potential impacts within each domain. Specifically, each domain within the portfolio included a description of the domain, a place to document the date(s) of the relevant activities, criteria for documenting content relevant to each domain, and examples of demonstrating excellence and impact within the domain. The development team reviewed and edited the initial draft portfolio before presenting it to the expert panel for review through the modified Delphi consensus process.
Formation of expert panel
The expert panel consisted of 35 faculty (including M.A.M.) from across North America with expertise in health care improvement. Panel members were selected based upon their expertise in health care improvement, including improvement research, health care innovations, patient safety, QI, and health care improvement education. Given the focus on developing an interprofessional health care improvement portfolio, we took care to ensure panel members represented a multitude of health care professions including medicine (n = 25), nursing (n = 7), pharmacy (n = 1), clinical psychology (n = 1), and health services (n = 1). We recruited 16 panel members from the faculty of the VA Quality Scholars program, the longest-standing interprofessional training program for health care improvement, which spans 11 sites around the United States and an affiliated site in Toronto, Canada. Faculty from the VA Quality Scholars program were recruited because they are recognized as experts in health care improvement, are interprofessional, and lead interprofessional teams at their respective sites. We recruited other panel members by snowball sampling, in which we queried experts in health care improvement to recommend other potential panelists. We contacted all panel members by email and described membership expectations and the projected timeline for the modified Delphi consensus process. All 35 individuals initially committed to the project; the final panel included 34 participants after 1 individual withdrew because of scheduling conflicts.
Expert panel ratings
The modified Delphi consensus process was conducted through an online survey platform (SurveyMonkey). We emailed an invitation to all panelists with an introduction to the portfolio and instructions on how to participate in the modified Delphi process. The first round of the survey asked panelists to complete open-ended questions regarding each category and the criteria within each category of the portfolio. Each panelist received a screenshot of each section of the portfolio; we asked them which elements and criteria should be added or removed. Panelists could provide additional comments on each section of the portfolio. Open-ended questioning allowed panelists to provide both broad and specific feedback on the portfolio’s scope, content, and structure. Panelists had 2 weeks to complete and submit their review and feedback. Panelists’ comments were anonymous to each other but identifiable to the development team.
After all panelists submitted their responses, the 4 members of the development team reviewed the comments, then organized and aggregated responses into related areas of revision. Next, we revised the portfolio based on the aggregated comments as well as suggested structural changes and additions or revisions of criteria within each domain. The revised draft was then sent to the panel with the second round of survey questions. For the second round, panelists were presented with a screenshot of each revised section of the portfolio and given a series of Likert-type questions to determine the comprehensiveness and clarity of each section. The literature on the Delphi consensus process does not specify a validated scoring system; however, it has been recommended that consensus be defined at 70% agreement or greater. 22 We prospectively defined consensus as a high level of agreement: an average rating of 8 or above on each 9-point Likert-type scale. Comprehensiveness and clarity were each assessed on a 9-point scale, where a rating of 1 indicated “not comprehensive” or “not clear” and a rating of 9 indicated “totally comprehensive” or “totally clear.” We asked participants for additional comments on each section of the portfolio. Finally, at the end of the survey, participants were asked to consider the full portfolio and indicate their level of agreement with the following statement: “This portfolio allows one to assess rigor across multiple dimensions of an applicant’s health care improvement work within the context of academic promotion and tenure.” Participants responded on a 9-point Likert-type scale where a rating of 1 indicated “strongly disagree” and a rating of 9 indicated “strongly agree.” Participants could leave comments about the overall portfolio before the survey ended.
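The prospectively defined consensus rule amounts to a simple calculation over panelists' ratings. The sketch below illustrates it in Python; the function name and sample ratings are hypothetical illustrations, not the study's data.

```python
from statistics import mean

# Prospective consensus rule: a section reaches consensus when the mean
# rating on its 9-point Likert-type item is 8 or above.
CONSENSUS_THRESHOLD = 8.0

def reaches_consensus(ratings):
    """Return True when the mean of the panelists' 9-point ratings
    meets or exceeds the prospective threshold of 8."""
    return mean(ratings) >= CONSENSUS_THRESHOLD

# Hypothetical ratings for one portfolio section (not actual panel data)
section_ratings = {
    "comprehensiveness": [9, 8, 8, 9, 7, 8, 9, 8],
    "clarity": [8, 9, 7, 8, 8, 9, 8, 8],
}

for item, ratings in section_ratings.items():
    print(item, round(mean(ratings), 1), reaches_consensus(ratings))
```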
After all panelists submitted their second round of responses, the development team analyzed the data, reviewed the comments, and organized them into thematic concepts.
The literature review established 8 major domains for consideration that we included in the initial portfolio: personal statement; leadership and administrative activities; project activities; QI education and curricular activities; QI research and scholarship; QI honors, awards, and recognitions; QI training and certification; and supporting documents.
Two main themes emerged from the results of the first round of the consensus process. First, participants commented that “quality improvement” or “QI” terminology was limiting and should be revised to “health care improvement” to encompass impact more broadly in the areas of patient safety, quality improvement, implementation, and innovation. Second, participants indicated that there was an artificial distinction between project activities and QI research and scholarship. Additionally, participants indicated that scholarship should be embedded within both QI education and curricular activities as well as project activities. Thus, the first-round ratings refined the portfolio from the original 8 domains into 7: personal statement; health care improvement training and certification; leadership and administrative roles; health care improvement project activities; health care improvement coaching, teaching, and curricular activities; health care improvement honors, awards, and recognitions; and supplemental documents (see Table 1 for definitions).
In addition to broad feedback, participants suggested additional criteria within each domain, recommended clearer language for existing criteria, and identified redundancy between domains. Participants noted the emphasis on interprofessional teamwork and suggested revisions to the original language to capture the construct more clearly. For example, participants suggested that the portfolio should describe the diversity of the team, including not only interprofessional health care professionals but also patients and community members, as relevant. The language of the portfolio was edited to prompt users to describe the diversity of the team. These edits were included in the first round of revisions to ensure the portfolio was comprehensive and its language clear.
We then sent the revised portfolio out to the same 34 expert consensus panelists to complete the second-round ratings. Participants achieved consensus with average scores of 8.4 in comprehensiveness and 8.3 in clarity in this round. Median scores for each section of the portfolio were between 8 (range, 4–9) and 9 (range, 1–9) on both comprehensiveness and clarity.
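The second-round score summaries reported here (mean, median, and range per section) follow from standard descriptive statistics. A minimal sketch, using hypothetical ratings rather than the panel's actual data:

```python
from statistics import mean, median

def summarize(ratings):
    """Summarize a section's 9-point Likert-type ratings the way the
    second-round results are reported: mean, median, and range."""
    return {
        "mean": round(mean(ratings), 1),
        "median": median(ratings),
        "range": (min(ratings), max(ratings)),
    }

# Hypothetical clarity ratings for one portfolio section (not study data)
clarity = [9, 8, 8, 9, 7, 9, 8, 8, 9, 8]
print(summarize(clarity))
```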
Overall, comments indicated support of the portfolio. The last question, “This portfolio allows one to assess rigor across multiple dimensions of an applicant’s health care improvement work within the context of academic promotion and tenure,” received a median score of 8 (range, 6–9). However, panelists suggested the utility of the portfolio was broader than academic promotion and tenure and should extend to individuals in health care settings outside academia. Panel members also suggested including supporting documents with the portfolio; one recommended example was “letters of support from local leaders showing impact.” The final portfolio, relevant for those within and outside academia, incorporates the minor revisions received during the second-round ratings (see Figure 2 for an example section and Supplemental Digital Appendix 1, at https://links.lww.com/ACADMED/B197, for the full portfolio).
We developed this portfolio to address an unmet need for faculty to document their health care improvement excellence and impact within academic health care centers to further their professional advancement. After we finalized the portfolio, we presented 2 national workshops in which interprofessional workshop participants used the portfolio to capture and curate their health care improvement contributions. Additionally, a faculty member at Baylor College of Medicine (M.A.M.) used the portfolio to document and demonstrate his excellence in health care improvement for his promotion and tenure application. Although the portfolio was not a required document, the format and structure of the portfolio allowed him to successfully demonstrate his contributions to health care improvement, and he received promotion to associate professor with tenure. See Figure 3 for a completed section of his health care improvement portfolio and Supplemental Digital Appendix 2, at https://links.lww.com/ACADMED/B197, for a fully completed portfolio.
In addition to using the portfolio for promotion and tenure, both the expert consensus panelists and the 2 interprofessional, national workshop audiences indicated broader applicability of the portfolio. Front-line providers outside academia can use the portfolio to document and demonstrate excellence in health care improvement to facilitate their own career development. Leaders of clinical departments could use the portfolio to document their contributions to the development and impact of interprofessional teams and clinical program improvements. The portfolio can be used to document one’s work for the purposes of justifying and negotiating a salary increase, applying for a new position, facilitating annual performance review and goal setting, and making the case for additional resources. Currently, interprofessional fellows in the VA Quality Scholars program, including clinical psychologists, nurses, pharmacists, physical therapists, and physicians, use the portfolio to capture and demonstrate their health care improvement impacts for future career placement.
While we recognize the broad applicability of the portfolio, one limitation is that we did not include individuals outside academia as panelists who evaluated the portfolio. Future validation with individuals outside academia is warranted. Moreover, while the panel was interprofessional, the majority of panel members were physicians. This may have biased the portfolio to be more relevant for physicians than other health professionals. However, the portfolio has been used by physical therapists, nurses, clinical psychologists, and physicians to capture impact in health care improvement.
Using a virtual consensus panel approach, we developed a portfolio for documenting excellence in health care improvement for academic faculty. This virtual modified Delphi consensus process allowed for the creation of a conceptually robust, content-rich portfolio. The inclusion of experts in health care improvement drawn from medicine, nursing, pharmacy, clinical psychology, and health services research increases the likelihood that the portfolio will have widespread applicability across multiple professions. Our final list of 7 portfolio domains received greater than 80% agreement among the expert consensus panel.
Recognizing and promoting interprofessional faculty with expertise in health care improvement, including developing, implementing, and sustaining improvement projects, depends on a host of factors such as supportive administrative structures, sufficient resources, and available training opportunities. Yet, to our knowledge, no previous portfolio has been developed by an interprofessional panel for interprofessional use. The existing health care improvement portfolio 10 that preceded ours was developed to represent the needs of a single profession and may not be applicable to all health care professionals. Moreover, the portfolio we developed places a strong emphasis on interprofessional health care teams as an important criterion for demonstrating excellence in health care improvement.
The health care improvement portfolio has many advantages for professions in which a clinical doctorate is one of the terminal degrees offered, such as nursing, pharmacy, and physical therapy. In the current tenure and promotion systems of these fields, research activities outweigh other activities, and contributions to health care improvement are generally not recognized. Expertise in health care improvement is often overlooked as evidence of excellence for academic promotion. Moreover, the traditional curriculum vitae structure includes sections for research activities but not health care improvement activities. The portfolio is intended to be used as an additional document for promotion to highlight important contributions to health care improvement. Thus, the health care improvement portfolio we designed has added value for these professions, highlighting accomplishments that result in improvements in patient outcomes, reductions in costs, and other organizational impacts.
In closing, a panel of academics and experts in health care improvement reached consensus on a conceptually robust, content-rich health care improvement portfolio to be used by interprofessional faculty for promotion and tenure. Our next steps are to develop objective criteria for each section of the portfolio to recognize excellence in health care improvement for faculty awards.
The authors would like to acknowledge the expert panelists for their contributions to the development of this health care improvement portfolio. Panelists were: Corrine Abraham, DNP, RN; David Aron, MD; Uma Ayyala, MD; Michael Bowen, MD; Jill Cawiezell, PhD, RN; Elise Dasinger, PharmD, MHA; Louise Davies, MD, MS; David Ganz, MD, PhD; Anne Gill, DrPH, MS, RN; Krysta Johnson-Martinez, MD; Peter Kaboli, MD, MS; Sei Lee, MD, MAS; Erica Lescinskas, MD; Luci Leykum, MD, MBA, MSC; Daniel Murphy, MD, MBA; Linda Norman, DSN, RN; Michael Ohl, MD, MSPH; Kevin O’Leary, MD, MS; Pat Patrician, PhD, RN; Federico Perez, MD; Laura Petersen, MD, MPH; Read Pierce, MD; Sheila Rauch, PhD; Russell Rothman, MD, MPP; Christine Rovinski, MSN, APRN; Emily Sedgwick, MD; Hardeep Singh, MD, MPH; Christine Soong, MD, MSC; Ted Speroff, PhD; Benjamin Taylor, MD; Anne Tomolo, MD, MPH; Margaret Wallhagen, RN, PhD; Brian Wong, MD, MPH; and LeChauncy Woodard, MD, MPH.
1. Staiger TO, Wong EY, Schleyer AM, Martin DP, Levinson W, Bremner WJ. The role of quality improvement and patient safety in academic promotion: Results of a survey of chairs of departments of internal medicine in North America. Am J Med. 2011;124:277–280.
2. Anderson MG, Alessandro DD, Quelle D, Axelson R, Geist LJ, Black DW. Recognizing diverse forms of scholarship in the modern medical college. Int J Med Educ. 2013;4:120–125.
3. Staiger TO, Mills LM, Wong BM, Levinson W, Bremner WJ, Schleyer AM. Recognizing quality improvement and patient safety activities in academic promotion in departments of medicine: Innovative language in promotion criteria. Am J Med. 2016;129:540–546.
4. Shojania KG, Levinson W. Clinicians in quality improvement: A new career pathway in academic medicine. JAMA. 2009;301:766–768.
5. Simpson D, Meurer L, Braza D. Meeting the scholarly project requirement-application of scholarship criteria beyond research. J Grad Med Educ. 2012;4:111–112.
6. Coleman DL, Wardrop RM 3rd, Levinson WS, Zeidel ML, Parsons PE. Strategies for developing and recognizing faculty working in quality improvement and patient safety. Acad Med. 2017;92:52–57.
7. Simpson D, Fincher RM, Hafler JP, et al. Advancing educators and education by defining the components and evidence associated with educational scholarship. Med Educ. 2007;41:1002–1009.
8. Baldwin CD, Gusic ME, Chandran L. The educator portfolio: A tool for career development. Association of American Medical Colleges. https://www.aamc.org/professional-development/affinity-groups/gfa/faculty-vitae/educator-portfolio-tool. Accessed September 14, 2021.
9. Shinkai K, Chen CA, Schwartz BS, Loeser H, Ashe C, Irby DM. Rethinking the educator portfolio: An innovative criteria-based model. Acad Med. 2018;93:1024–1028.
10. Taylor BB, Parekh V, Estrada CA, Schleyer A, Sharpe B. Documenting quality improvement and patient safety efforts: The quality portfolio. A statement from the academic hospitalist taskforce. J Gen Intern Med. 2014;29:214–218.
11. Sehgal NL, Neeman N, King TE. Early experiences after adopting a quality improvement portfolio into the academic advancement process. Acad Med. 2017;92:78–82.
12. Clinical Learning Environment Review (CLER) Evaluation Committee. CLER Pathways to Excellence: Expectations for an Optimal Clinical Learning Environment to Achieve Safe and High-Quality Patient Care, Version 2.0. 2019. Chicago, IL: Accreditation Council for Graduate Medical Education.
13. Helmer-Hirschberg O. Analysis of the Future: The Delphi Method. 1967. Santa Monica, CA: RAND Corporation; https://www.rand.org/pubs/papers/P3558.html. Accessed September 14, 2021.
14. Humphrey-Murto S, Wood TJ, Gonsalves C, Mascioli K, Varpio L. The Delphi method. Acad Med. 2020;95:168.
15. Dalkey NC. Delphi. 1967. Santa Monica, CA: RAND Corporation; https://www.rand.org/pubs/papers/P3704.html. Accessed September 14, 2021.
16. George N, Barrett N, McPeake L, Goett R, Anderson K, Baird J. Content validation of a novel screening tool to identify emergency department patients with significant palliative care needs. Acad Emerg Med. 2015;22:823–837.
17. Boyer EL. Scholarship Reconsidered: Priorities of the Professoriate. 1990. Princeton, NJ: The Carnegie Foundation for the Advancement of Teaching; https://files.eric.ed.gov/fulltext/ED326149.pdf. Accessed September 14, 2021.
22. Humphrey-Murto S, Varpio L, Gonsalves C, Wood TJ. Using consensus group methods such as Delphi and Nominal Group in medical education research. Med Teach. 2017;39:14–19.