Deming, Nicole MA; Fryer-Edwards, Kelly PhD; Dudzinski, Denise PhD, MTS; Starks, Helene PhD, MPH; Culver, Julie MS; Hopley, Elizabeth; Robins, Lynne PhD; Burke, Wylie MD, PhD
Many people say that it is the intellect which makes a great scientist. They are wrong: it is character. —Albert Einstein
Federal and institutional research rules and guidelines exist to protect the safety of human subjects and provide guidance for the responsible conduct of research. However, recent reports of significant adverse events at major research institutions raise questions about the kind of oversight needed to prevent such occurrences.1–3
One approach to improving the safety of research involving human subjects is through educational interventions aimed at fostering the integrity of scientists conducting research.4 The core content areas for research ethics education usually are federal regulations; the three principles of respect for persons, beneficence, and justice outlined in the Belmont Report5; and discussions of historical cases of research abuse such as the U.S. Public Health Service Study at Tuskegee6 and the Willowbrook study.7 These three components provide guidelines and general goals and link investigators to a common history concerning research with human subjects. However, although principles are a necessary component of ethical decision making, they are rarely sufficient in and of themselves.
Researchers are faced with daily decisions that are subtle and nuanced, requiring professional judgment to determine appropriate actions. Often, discussion of general ethical principles and careful review of historical cases of blatant research abuse are not enough to provide researchers with guidance for “on the ground” decisions. Other traditions within ethics such as casuistry, feminist ethics, and virtue ethics8 have enriched the analysis of daily dilemmas by focusing more attention on context-, climate-, and systems-based concerns and on core commitments of researchers. These additional ethical frameworks help to further illuminate how scientists make decisions and what can be done to promote and reinforce ethical behavior.
Studies suggest that professionals are influenced by climate, institutional social structures, and ethical norms derived from sources other than research integrity guidelines.9–10 The interviews we conducted for this study comprise the first phase of a larger study examining integrity in research. These interviews were designed to elicit from researchers the common dilemmas faced in research involving human subjects. The interviews also explored how scientists manage dilemmas that arise in the course of their research work. We were interested in whether and how the Aristotelian concept of phronesis, or practical wisdom, was reflected in researchers’ thinking when faced with dilemmas. Practical wisdom focuses on particulars and aims toward action and judgment and is gained through experience. Because practical wisdom is acquired only through apprenticeship and habitual practice,11 medical educators and physicians have found this concept useful when describing judgments that are made in the context of clinical practice.12–13 We apply the concept here to make sense of our participants’ responses and to suggest a supplemental approach to research ethics education.
We used snowball sampling methods to recruit senior scientists and research administrators who conduct studies involving human participants at the University of Washington (UW) (which is categorized as a Research I university) and its affiliated research centers in 2004. We employed snowball sampling methods because we wanted to elicit examples of everyday ethical dilemmas in research from experienced and respected researchers. We sought out researchers who were regarded as mentors and leaders in areas of good research practice and who were seen as models of integrity. By choosing researchers based on reputation, we increased our likelihood of speaking to researchers who had thought about ethical issues in their work, which facilitated the level of insights obtained in the space of a short interview. Our target sample size was 20 participants drawn from five areas of research: (1) tissue and cell culture, (2) population-based, (3) behavioral and social science, (4) clinical trials, and (5) research administration. A total of 36 potential participants were approached, and we invited at least three researchers from each area to participate.
The objective of the interviews was to collect stories and develop vignettes of research dilemmas to use in the next phase of our research, which was designed to systematically examine how scientists would address, analyze, and act in response to a series of ethical problems. For the initial phase of the research, we conducted semistructured interviews to investigate these participants’ perceptions of institutional norms and expectations for research integrity in studies involving human subjects. We used an interview guide to explore how the researchers we interviewed defined research integrity, and we probed for examples of lapses in research integrity they had witnessed, what kinds of resources they found useful in resolving ethical dilemmas, and what they thought about current ethics training required by the National Institutes of Health. Three members of the research team conducted the interviews (ND, KFE, JC). Interviewers were assigned to participants with whom they had no prior relationship. We obtained written consent at the start of each interview. All interviews were conducted in person and were audio recorded and transcribed. The transcripts were reviewed, and all personally identifying information was removed. The University of Washington institutional review board approved all study procedures.
We analyzed the transcripts using constant comparison methods to categorize and describe common research conflicts and strategies identified by participants.14 This process involved multiple readings of the transcripts and discussion of emerging themes and possible conceptual models, and then additional review of the transcripts. We developed a coding scheme based on the interview guide; we refined and expanded this scheme with new codes as we identified additional concepts and categories in the transcripts. To increase the reliability of the coding process, a minimum of two researchers independently assigned codes to the transcripts and then compared their coding. A third team member reviewed any discrepancies that could not be resolved through consensus. We used Atlas.ti software version 5.0 (Atlas.ti Inc., Cologne, Germany) to manage the data and analytic process. For the analysis presented in this paper, the entire team reviewed excerpts from transcripts to identify the strategies and reasoning that participants used in response to ethical dilemmas.
Of the 36 researchers we invited, 23 participated (four did not reply, eight did not self-identify as currently doing human participants research, and one requested deferral). At least three researchers and a maximum of six were recruited from each research area.
We report on two strategies used by our study participants when they were faced with dilemmas. We relate these strategies to two general frameworks for approaching ethical dilemmas in human subjects research: (1) reliance on ethical principles, rules, or regulations; and (2) practical wisdom. For our purposes, practical wisdom referred to a researcher’s use of intuitions about right and wrong based on previous professional experiences, personal values, and actual experience with research practices, which shape core commitments and judgments about appropriate scientific conduct. We report examples from the two approaches, along with strategies for developing and maintaining practical wisdom, below. We found that scientists rely on both principles and practical wisdom at multiple points in resolving research dilemmas. The typical approach included defining ethical problems, identifying alternatives, and determining the best course of action.
Study participants mentioned general principles in the form of rules, guidelines, ethical principles, or regulations when identifying or negotiating dilemmas in research. Participants within all five research areas described using principles to critically review study designs, communicate standards within the scientific community, and illustrate examples of unethical behavior. When asked to describe the decision-making process and the use of regulations and principles as decisional guides, one senior administrator responded:
I do use them, primarily in the process of drafting informed-consent documents, and figuring out how I’m going to use consent. I take the three principles, respect, beneficence, and justice, and I use those as I draft the proposal to make sure that subject selection is equitable, that I’m actually considering the characteristics of the subject population in the broadest possible terms, that I’m looking at issues of respect, in terms of not lying to people, not withholding information, and beneficence in terms of making it clear to subjects what I hope will be the downstream benefit: trying to justify their time.
These general principles helped ensure that the research conducted was respectful of human subjects and ultimately beneficial to society, and that its risks and benefits were distributed equitably among populations. The application of principles allowed participants in our study to speculate about possible adverse results and potential problems before they occurred. Thus, by anticipating outcomes and scenarios, participants could identify and prevent clearly unacceptable consequences. One population-based scientist identified the well-known rules governing publication:
There are sort of three absolutes of publication or, you know, research. You shouldn’t falsify, you shouldn’t fabricate, you shouldn’t plagiarize, right? So those are sort of the three deadly sins.
However, participants also recognized that simply stating that something is unacceptable is not enough to guarantee compliance with rules. As another population-based researcher stated,
Falsification of data—misrepresentation of experimental results—is just simply unacceptable. And they [scientists] should feel that in their heart, and not just be told that it’s unacceptable!
The “feeling in their heart” can be described as intuition, which suggests implicit understanding of why the rules are important and why they matter for research integrity.
Our study participants commented that teaching the principles of respect for persons, beneficence, and justice is often limited to the examination of blatant examples of unethical behavior, such as the experiments conducted by Nazi scientists or the U.S. Public Health Service Study at Tuskegee. Although they acknowledged that researchers must be aware of these past atrocities, they noted that a consequence of limiting the discussion to these extreme cases is that researchers will not be able to relate or apply these principles to their own research practice, where they are likely to face less dramatic dilemmas. For example, one research administrator explained the disconnect that occurs between historical cases and individual researchers:
They [researchers] look at those examples and see them as examples of unethical investigators. And they think to themselves, “Research integrity or research ethics is not an issue for me because I’m not a Nazi or because I’m an ethical person or I have a conscience.” Or they read stories and they read the NIH guide and they see the reports of scientific misconduct and read about people who have fabricated their data and again they see those examples and think, “I don’t have to worry about issues in research ethics or research integrity because I wouldn’t do something that’s so flagrantly unethical.” And they miss the fact that most of the issues in research ethics and integrity for sort of the day-to-day investigator are not issues that are kind of flagrant violations of ethical principles and norms and that they’re much more subtle issues.
Investigators need to be able to identify and address the day-to-day issues that arise within their own environment. Participants were specific about the usefulness of regulations and principles to serve as broad guidelines for appropriate actions; however, they also acknowledged the limitations of relying on these mechanisms alone. Participants in our study noted that sometimes it was unclear how a general ethical principle applied to a complex, concrete situation. They also noted that different research teams varied in their interpretation and application of the rules. Participants also spoke of how rules they had previously thought they understood became increasingly difficult to apply as they gained more research experience or as the research environment changed over time. As one clinical trials researcher said,
Traditionally I was sort of taught that you don’t offer incentives that are coercive. But, increasingly, it’s unclear to me what that really means.
Another cell and tissue researcher commented about authorship guidelines:
There are agreed-upon guidelines that have been published by a number of individuals as to what really should be essential criteria for authorship. How that’s actually practiced deviates around that norm by a fair bit, in my experience. My own view is that those are good guidelines, and I think they’re appropriate. Interpreting guidelines in a day-to-day context is always a challenge, because ethics guidelines are usually not absolute.
Our participants spoke of many judgments that were ambiguous within their daily research practice; yet, this did not mean that they made decisions arbitrarily. These scientists described knowing what was right without having to appeal to an abstract principle or philosophical theory; instead, they knew intuitively. Such intuition may have been the result of habitual ethical practices that had led these scientists to develop practical wisdom over time. As one clinical trials researcher said:
I couldn’t give you the definition of integrity, other than to say it really distills down to doing the right thing, and in that sense, your head has to listen to your gut and your gut will tell you if you’re doing the right thing, very often. And if it doesn’t feel right, it probably isn’t ethical or of integrity. That presumes, of course, that you have a reasonable ethical compass.
We heard that following one’s intuition is a common practice when faced with a dilemma. Implicit in this practice is an assumption that one’s moral compass is well calibrated. Within a virtue ethics framework, practical wisdom is built by forming habits in the context of professional practice and experience, which allows one to intuit and quickly interpret ethical rules pertinent to the situation at hand. For example, the rule “do not lie” means something different at the ages of 6, 16, and 60 because both an individual’s conceptual understanding of the rule and the environments in which the rule is applied change over time. It is a challenge of maturity and professional development to learn how to exercise good judgment in novel and complex situations. In the results that follow, we characterize three behaviors used by study participants to calibrate their own intuitions: self-reflection, sincere skepticism, and, most important, open dialogue with peers and team members.
Calibrating intuition and developing practical wisdom
Researchers discussed self-reflection as a way to ensure the unbiased representation of findings and to challenge the investigators to assess their own motivations and judgments. In the words of one social science researcher,
I think there are so many ethical issues if we sort of broaden out our thinking about how to do research, that they kind of never end. And the thinking about it has to be integrated into everyday life, pretty much. Everything you do, you stop and ask yourself: Is this the right thing to do? Why do I think that? Why do I think it’s not? [Laughs.] How do I best address it?
The temptation to allow personal preferences to influence study results was discussed in detail by participants from all research areas. They discussed the need to critically assess study findings, along with personal motivations, to counter the understandable desire to present their data in the most positive light. According to one population-based researcher,
A sincere skepticism, probably more than anything else, informs your interpretation of data, and it’s a healthy expectation that you may easily find nothing. And being willing to accept that an answer such as no association is as good a scientific answer as association.
Participants identified open dialogue with their colleagues and professional community as one of their greatest resources for addressing ethical dilemmas. They suggest that the habits and practices formed through participation in the professional community help them keep their moral intuition, or practical wisdom, calibrated. One social scientist said,
I think another important thing is that you’re active in the research community, that you go to seminars and colloquia, and you hear colleagues talk about their research in the process of it. These kinds of dilemmas come up all the time … because you’re part of that discourse, it enables you to filter better… . One cannot lock himself or herself in a closet and just be a researcher and totally rely on their own sense of how to handle questions on that sort of thing. It’s absolutely essential, I think, that you stay part of the collective.
Participants emphasized the need to stay engaged with their colleagues about all kinds of ethical issues, especially in an ever-changing social and regulatory environment. Another participant from research administration commented:
What happens in society over time, of course, [is that] standards just change and evolve, and what [at] one point in time might be considered very acceptable conduct no longer is, and vice versa, and it’s just as often this kind of fabric of social control, and general belief [in] acceptable conduct is what’s actually really important. So that means that people need to have contact with their peers and their managers. They shouldn’t be working in isolation.
This insight suggests that the community nurtures practical wisdom by requiring dynamic reflection on professional values that are often taken for granted.
Our study indicates that researchers benefit from developing practical wisdom to assist them in applying ethical principles to a range of decisions that occur during the course of their research. Formal guidelines are important to establish a common language and history; however, knowing what the rules are is not sufficient to ensure observance of those guidelines. The researchers we interviewed referred to an internal sense, or intuition, about right and wrong that guided them in making judgments in daily research decisions and that motivated them to act. Certainly, students enter into research training with a learned sense about right and wrong from prior life and work settings. Because practical wisdom is context specific, trainees and researchers must learn to make judgments within the specific setting of research (e.g., what is the difference between cleaning data and manipulating it? What data do you include in a paper, and which do you leave out? What contributions count for authorship? How much information is enough for informed consent?). A research training program has a responsibility to cultivate the appropriate sensibility in researchers as they develop professionally.
Research ethics education could do more to capitalize on researcher motivations and intuitions and to play a role in shaping those instincts through clear articulation of professional responsibility. Medical ethics education has used virtue ethics as a tool for thinking about professional development and for teaching professional responsibility.15–16 Moral psychology has long recognized the need to attend to motivation alongside reasoning to promote good judgment and action.17 Research ethics educators can learn from these successful models and improve institutional contexts and practices for training scientists. From both disciplines, we have learned that ethical lapses can occur for multiple reasons: lack of recognition of the issues, lack of reasoning ability, lack of motivation and commitment, and inability to act (Table 1). One can hypothesize that research ethics education has a role to play in cultivating all four skills that could minimize such ethical lapses. A combination of educational approaches needs to be employed, and attention to both principles and cultivating practical wisdom might better align the program to the demands of practice.
One strategy for designing an educational program to incorporate the concept of practical wisdom would be to examine further how individuals develop their intuitions. For Aristotle,11 phronesis is cultivated over time through practice. As practice becomes habitual, one’s character is shaped. On this view, practical wisdom is something that can be learned.13 These behaviors are adopted through a combination of training, observation, and experience, and this process is shaped by one’s interactions with the community. According to Aristotle, “We become just by the practice of just actions, self-controlled by exercising self-control, and courageous by performing acts of courage.”11 Over time, as researchers observe ethical practices in respected colleagues and participate in discussions of ethical dilemmas and solutions with these colleagues, they begin to embody exemplary ethical behaviors that become integrated into their moral character. Researchers’ actions become apt and wise because they are informed by both communal and individual experience, and their conduct reflects ethical values that are so ingrained that they become intuitive for the researcher. We know from work examining expert and novice decision making that expert-level professionals often make judgments without being able to unpack the steps that helped them get to their decisions.18 A novice, in contrast, needs to think through all the steps on his or her way to a decision. Beyond modeling good practice, the expert must also be willing and able to share his or her expert insights and reflections so that the novice can better understand the deliberation that led to a conclusion or action. One challenge for research training is that if expert judgments are made without explicit descriptions of the intuitions that shaped those judgments, novices have few opportunities to learn the nuances required for each judgment.
Researchers are given guidelines, mandatory training, and forms to complete as ways to guarantee ethical conduct. The findings from our study suggest that it may be even more important to ensure that they receive adequate mentoring and support from their colleagues and professional community to practice ethical behavior. Community support and encouragement help researchers withstand pressures to behave unethically. Given this twofold approach to training, we need to be aware that within some research contexts, individuals are allowed and sometimes pressured to work in isolation. Novice researchers also work with mentors and colleagues who may not always appropriately model ethical practice and who may not provide or create opportunities for reflection or exposure to alternative perspectives and practices.19–20 If we take the need for dialogue and collegial interaction seriously, institutions and training programs may decide to create opportunities and support systems for this to occur.
Current research educational methods are narrowly tailored and need to expand beyond the skills of ethical deliberation. Because moral development is a continual process influenced by experience and environment, educational methods should build on this natural process and address additional sources of moral failure. To successfully address ethical dilemmas, scientists need to develop skills to recognize, analyze, internalize, and act ethically. Phronesis provides a framework under which all of these skills can be addressed through habit and mentorship (Table 1).
Principles and rules cannot solve all ethical dilemmas, because instances occur in which valid arguments can be made for opposing actions. In these situations, the researcher must identify the significant features of a problem, choose which principle should be given priority, reflect on the options, and decide which action is the right choice. Training programs and professional development in research need to focus on cultivating practical wisdom to guide these on-the-ground judgments, in addition to establishing overarching rules and principles.
Through this preliminary study, we have found that researchers make decisions that feel right to them and that they develop their intuitions through experience, self-reflection, and communication with others. Good research practices take time to cultivate, and there is much potential for ethical lapses. We recommend further research investigating how researchers develop practical wisdom and what steps an institution can take to promote the development of ethical scientists.
This research was supported by the Research Integrity Program, an ORI/NIH collaboration, grant #R01NS44486-01. The authors thank their colleague S. Malia Fullerton, PhD, for her comments on this paper and the ongoing project, and Suzanne Manning, JD, MA, for her help conducting interviews. The authors also gratefully acknowledge the time the participating researchers gave to this project.
1Kempner J, Perlis CS, Merz JF. Forbidden knowledge. Science. 2005;307:854.
2Martinson BC, Anderson MS, de Vries R. Scientists behaving badly. Nature. 2005;435:737–738.
3Steinbrook R. Protecting research subjects: the crisis at Johns Hopkins. N Engl J Med. 2002;346:716–720.
4Committee on Assessing Integrity in Research Environments, National Research Council, Institute of Medicine. Integrity in Scientific Research: Creating an Environment That Promotes Responsible Conduct. Washington, DC: National Academies Press; 2002.
5The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research. Washington, DC: U.S. Food and Drug Administration, Office of Science Communication and Coordination; 1979.
6Jones JH. Bad Blood: The Tuskegee Syphilis Experiment. 2nd ed. New York: Free Press; 1993.
7Krugman S. Experiments at the Willowbrook State School. Lancet. 1971;1:966–967.
8DuBose ER, Hamel RP, O’Connell LJ, eds. A Matter of Principles? Ferment in U.S. Bioethics. Valley Forge, Pa: Trinity Press International; 1994.
9Hafferty FW, Franks R. The hidden curriculum, ethics teaching, and the structure of medical education. Acad Med. 1994;69:861–871.
10Martinson BC, Anderson MS, Crain AL, de Vries R. Scientists’ perceptions of organizational justice and self-reported misbehaviors. J Empir Res Hum Res Ethics. 2006;1:51–66.
11Aristotle. Nicomachean Ethics. Indianapolis, Ind: Bobbs-Merrill; 1962.
12Hilton SR, Slotnick HB. Proto-professionalism: how professionalisation occurs across the continuum of medical education. Med Educ. 2005;39:58–65.
13Tyreman S. Promoting critical thinking in health care: phronesis and criticality. Med Health Care Philos. 2000;3:117–124.
14Strauss A, Corbin J. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. 2nd ed. Thousand Oaks, Calif: Sage Publications; 1998.
15Pellegrino ED. Professionalism, profession and the virtues of the good physician. Mt Sinai J Med. 2002;69:378–384.
16Jonsen A. Of balloons and bicycles: the relationship between ethical theory and practical wisdom. Hastings Cent Rep. 1991;21:14–16.
17Rest J, Narvaez D, eds. Moral Development in the Professions: Psychology and Applied Ethics. Hillsdale, NJ: Lawrence Erlbaum Associates Publishing; 1994.
18Benner P. From beginner to expert: gaining a differentiated clinical world in critical care nursing. Adv Nurs Sci. 1992;14:13–28.
19Fryer-Edwards K. Addressing the hidden curriculum in scientific research. Am J Bioeth. 2002;2:58–59.
20de Vries R, Anderson MS, Martinson BC. Normal misbehavior: scientists talk about the ethics of research. J Empir Res Hum Res Ethics. 2006;1:43–50.