
Why Open-Ended Survey Questions Are Unlikely to Support Rigorous Qualitative Insights

LaDonna, Kori A., PhD; Taylor, Taryn, MD, PhD, FRCPC; Lingard, Lorelei, PhD

doi: 10.1097/ACM.0000000000002088
Invited Commentaries

Health professions education researchers are increasingly relying on a combination of quantitative and qualitative research methods to explore complex questions in the field. This important and necessary development, however, creates new methodological challenges that can affect both the rigor of the research process and the quality of the findings. One example is “qualitatively” analyzing free-text responses to survey or assessment instrument questions. In this Invited Commentary, the authors explain why analysis of such responses rarely meets the bar for rigorous qualitative research. While the authors do not discount the potential for free-text responses to enhance quantitative findings or to inspire new research questions, they caution that these responses rarely produce data rich enough to generate robust, stand-alone insights. The authors consider exemplars from health professions education research and propose strategies for treating free-text responses appropriately.

K.A. LaDonna is assistant professor, Department of Innovation in Medical Education and Department of Medicine, University of Ottawa, Ottawa, Ontario, Canada; ORCID: http://orcid.org/0000-0003-4738-0146.

T. Taylor is assistant professor, Department of Obstetrics and Gynaecology, and scientist, Centre for Education Research and Innovation, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada.

L. Lingard is professor, Department of Medicine and Faculty of Education, and founding director and senior scientist, Centre for Education Research and Innovation, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada.

Funding/Support: None reported.

Other disclosures: None reported.

Ethical approval: Reported as not applicable.

Correspondence should be addressed to Kori A. LaDonna, Department of Innovation in Medical Education and Faculty of Medicine, University of Ottawa, 850 Peter Morand Cres., Ottawa, Ontario, Canada K1G 5Z3; e-mail: kladonna@uottawa.ca; Twitter: @Kori_LaDonna.

Health professions education researchers are increasingly relying on a combination of quantitative and qualitative research methods to explore complex questions in the field. Although this development is important and necessary, it has created new methodological challenges. Researchers must consider not only the principles of rigor attendant on one approach but also the complementarity or incompatibility of multiple approaches.1 Certainly, methods can be integrated strategically to productive effect, as in the case of mixed-methods research,2 but they can also be combined blithely, with negative implications for the quality of the insights the research can provide.

One common example of combining research methods that can be problematic is the quantitative survey or measurement instrument that includes a subset of “qualitative” questions. Often this takes the form of closed-ended (Likert-type or forced-choice) items followed by a few open-ended questions or, in medical education assessment, free-text fields for narrative feedback to teachers or learners about their performance. Analysis of the free-text responses is frequently presented as “qualitative” research. In this Invited Commentary, we explain why the analysis of such responses rarely meets the bar for rigorous qualitative work.


What Is the Bar for Rigor?

The purpose of qualitative research is to understand “how people interpret their experiences, how they construct their worlds, and what meaning they attribute to their experiences.”3 To do this, qualitative researchers engage in an iterative, time-intensive process that involves multiple rounds of data coding punctuated by peer debriefing, consultation with the literature, and additional data collection either to “member check”4 or to flesh out early analytical insights.3,5,6 While there are multiple ways to assess the rigor of this process,7–10 Tracy’s eight “big tent” criteria11 shape our assumptions about quality: That is, to meet the bar for excellence, qualitative research must (1) explore a worthy topic; (2) demonstrate rigor; be (3) sincere, (4) credible, and (5) ethical; (6) resonate with an audience; (7) make a significant contribution; and (8) achieve meaningful coherence. Meeting these criteria requires that both the research question and its findings be timely and relevant, and that researchers choose procedures that not only fit the research purpose but also produce rich and appropriate data, attend to reflexivity,12 and “meaningfully interconnect literature, research questions/foci, findings, and interpretations with each other.”11


What Is the Matter With a “Qualitative” Analysis of Free-Text Responses?

Free-text responses to survey or assessment items rarely produce data rich enough either to achieve sincerity, credibility, and resonance or to make a substantial contribution.11 Data richness has been variously described as involving descriptions of the particularities of the social world6; disclosure of participants’ feelings and commonly inaccessible thoughts5; “lush” or “thick” descriptions that evoke context, emotion, and social relationships13–15; and various formats and combinations of representation such as sounds, gestures, or videos.16 In short, for data to be “rich,” they must have context, personal meaning, emotional and social nuances, and layers of detail.

The space for free-text responses on paper survey instruments tends to be a few inches; on electronic or online instruments, it is often a restricted text field. In our experience, health professions teachers, students, and practitioners do not typically provide copious narrative feedback in the allotted space. In turn, data consisting of a few sentences (or fewer) often lack “attention to context and … conceptual richness.”17 In this situation, the number of surveys completed is irrelevant; 500 responses of a few phrases each will not necessarily constitute an appropriate sample, particularly if the questions, and the responses they elicit, are tangential add-ons to the research aims. Therefore, while analysis of free-text responses can generate preliminary understanding and help researchers begin to sketch content areas, it usually cannot get at the “how?” and “why?” questions that are the core business of qualitative research.

Additionally, free-text responses are rarely analyzed using rigorous qualitative procedures. Instead, the analysis may appear more quantitative than qualitative, particularly if the primary focus is frequency of keywords. That is not to say that counting recurring words is wrong but, rather, that it will often be insufficient. A robust qualitative analysis of free-text responses—whether it follows content,18,19 thematic,20 or discursive or linguistic procedures21—must do more than count. It must enrich our understanding of the social phenomena being explored.
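To make the distinction concrete, here is a minimal sketch, in Python, of the kind of keyword-frequency tally described above (the responses, stopword list, and resulting counts are hypothetical, not drawn from any study):

```python
# Minimal, illustrative sketch (hypothetical data): tallying recurring words
# in free-text survey responses.
import re
from collections import Counter

responses = [
    "Great teacher, very approachable.",
    "More feedback on my clinical reasoning would help.",
    "Feedback was rushed; I rarely knew how to improve.",
    "Approachable and supportive, but feedback was vague.",
]

stopwords = {"the", "a", "an", "and", "but", "was", "is", "to", "on", "my",
             "very", "more", "i", "how", "of", "would", "be"}

counts = Counter(
    word
    for response in responses
    for word in re.findall(r"[a-z']+", response.lower())
    if word not in stopwords
)

# The frequency table shows what recurs ("feedback" dominates), but not why
# feedback felt rushed or vague, or what that meant to the learner.
print(counts.most_common(5))
```

A tally like this can sketch content areas and flag recurring concerns, but it cannot answer the “how?” and “why?” questions described above; that interpretive work requires engaging with the comments in context, not merely counting them.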

For these reasons, we contend that responses to free-text questions will rarely meet the standard for richness required of qualitative data, and that the analysis of these responses, therefore, risks falling short of producing robust, interpretive, stand-alone insights. We caution researchers to think twice about whether these analyses are worthy of publication in their own right.


What Is the Solution?

There are, of course, exceptions. That is, valuable contributions can be made if free-text response data are “new, unique, or rare” and appropriate for answering a specific, a priori research question.11 To illustrate, consider two studies based on free-text comments in medical education assessment instruments that we think meet the bar for rigorous, stand-alone qualitative research. Myers et al22 used thematic analysis and concordance software to describe the patterns in clinical teaching assessments containing residents’ free-text comments about their clinical teachers. Among their findings was the insight that residents’ descriptions of “areas of improvement” for faculty may say more about resident learning needs than about faculty teaching behaviors. Ginsburg et al23 analyzed written comments by faculty on resident in-training evaluation reports and both described themes in the comments and explored their relationship with the CanMEDS competency framework. They discovered three recurring themes in the written comments that suggested competencies valued by faculty but not represented in the CanMEDS framework.

Importantly, in both of these examples the analysis of the free-text responses was the central focus of the study, not an add-on to a larger, quantitative project; as a consequence, these data were purposefully selected to answer the research question. Although additional data, such as interviews or participant observations, might have enhanced the authors’ findings, the free-text responses were appropriate for their inquiries. Finally, both groups of authors ensured rigor by analyzing and presenting the data in tandem with existing literature and conceptual frameworks. Therefore, although the data themselves were not “rich” as narratives, the analysis nevertheless yielded meaningful qualitative insights.

We are not suggesting that researchers should avoid open-ended survey questions, nor are we suggesting that researchers should ignore the data provided by such questions. On the contrary, survey respondents’ written responses can enhance quantitative findings, highlight problems with survey questions, corroborate answers to closed-ended questions, and inspire new avenues for research.17 And narrative responses on assessment instruments, albeit abbreviated, can provide a resource for answering important questions about the nature and meaning of written feedback in specific contexts.

However, as Silverman24 has argued, “qualitative research is not simply a set of techniques to be slotted into any given research problem.” To treat brief free-text responses appropriately, we offer three suggestions. First, in the case of a survey instrument that includes a few open-ended questions, researchers should conceptualize these data and their analysis a priori as an adjunct analysis to the primary survey research, not as a post hoc stand-alone piece of qualitative scholarship. Second, in the case of a study focused purposefully on brief responses to free-text items such as those found in many assessment instruments, researchers should ensure that the research question is focused and appropriate, and they should engage in analytical procedures that offer robust insights into the social phenomena being explored. Finally, to help ensure rigor, we suggest consulting with an experienced qualitative researcher who can both assist with study design and provide guidance as the analysis unfolds.


References

1. Varpio L, Martimianakis MA, Mylopoulos M. Qualitative research methodologies: Embracing methodological borrowing, shifting and importing. In: Cleland J, Durning SJ, eds. Researching Medical Education. Chichester, UK: John Wiley & Sons; 2015:245–255.
2. Creswell JW, Klassen AC, Plano Clark VL, Smith KC; Office of Behavioral and Social Sciences Research. Best Practices for Mixed Methods Research in the Health Sciences. Bethesda, MD: National Institutes of Health; 2011:2094–2103.
3. Merriam SB, Tisdell EJ. Qualitative Research: A Guide to Design and Implementation. 4th ed. New York, NY: John Wiley & Sons; 2015.
4. Lincoln YS, Guba EG. Naturalistic Inquiry. Newbury Park, CA: SAGE; 1985.
5. Charmaz K. Constructing Grounded Theory. 2nd ed. Thousand Oaks, CA: SAGE; 2014.
6. Denzin NK, Lincoln YS, eds. The SAGE Handbook of Qualitative Research. 3rd ed. Thousand Oaks, CA: SAGE Publications; 2005.
7. Morse JM. Critical analysis of strategies for determining rigor in qualitative inquiry. Qual Health Res. 2015;25:1212–1222.
8. Lincoln YS. Emerging criteria for quality in qualitative and interpretive research. Qual Inq. 1995;1(3):275–289.
9. Patton MQ. Enhancing the quality and credibility of qualitative analysis. Health Serv Res. 1999;34(5 pt 2):1189–1208.
10. Shenton AK. Strategies for ensuring trustworthiness in qualitative research projects. Educ Inf. 2004;22(2):63–75.
11. Tracy SJ. Qualitative quality: Eight “big-tent” criteria for excellent qualitative research. Qual Inq. 2010;16(10):837–851.
12. Pillow W. Confession, catharsis, or cure? Rethinking the uses of reflexivity as methodological power in qualitative research. Int J Qual Stud Educ. 2003;16(2):175–196.
13. Denzin NK. Interpretive Interactionism. Newbury Park, CA: SAGE Publications; 1989.
14. Creswell JW. Qualitative Inquiry and Research Design: Choosing Among Five Approaches. 2nd ed. Thousand Oaks, CA: SAGE Publications; 2007.
15. Geertz C. Thick description: Toward an interpretive theory of culture. In: Martin M, McIntyre LC, eds. Readings in the Philosophy of Social Science. Cambridge, MA: MIT Press; 1994:213–231.
16. Thorne S. Interpretive Description. Walnut Creek, CA: Left Coast Press; 2008.
17. O’Cathain A, Thomas KJ. “Any other comments?” Open questions on questionnaires—A bane or a bonus to research? BMC Med Res Methodol. 2004;4:25.
18. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15:1277–1288.
19. Miles MB, Huberman AM, Saldana J. Qualitative Data Analysis: A Methods Sourcebook. 2013.3rd ed. Thousand Oaks, CA: SAGE Publications.
20. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101.
21. Fairclough N. Discourse and text: Linguistic and intertextual analysis within discourse analysis. Discourse Soc. 1992;3(2):193–217.
22. Myers KA, Zibrowski EM, Lingard L. A mixed-methods analysis of residents’ written comments regarding their clinical supervisors. Acad Med. 2011;86(10 suppl):S21–S24.
23. Ginsburg S, Gold W, Cavalcanti RB, Kurabi B, McDonald-Blumer H. Competencies “plus”: The nature of written comments on internal medicine residents’ evaluation forms. Acad Med. 2011;86(10 suppl):S30–S34.
24. Silverman D. What counts as qualitative research? Some cautionary comments. Qual Sociol Rev. 2013;9(2):48–55.
© 2018 by the Association of American Medical Colleges