

Survey of Teaching Methods and Assessment in Entry-Level Physical Therapist Clinical Education

Huhn, Karen, PT, PhD; Black, Lisa, PT, DPT; Christensen, Nicole, PT, PhD, MAppSc; Furze, Jennifer, PT, DPT, PCS; Vendrely, Ann, PT, EdD, DPT; Wainwright, Susan, PT, PhD

Journal of Physical Therapy Education: September 2018 - Volume 32 - Issue 3 - p 241–247
doi: 10.1097/JTE.0000000000000043
Research Report

Background. Clinical reasoning skills have been identified as an integral part of physical therapist education, developed in both the academic and clinical settings. There is limited understanding of clinical instructors' (CIs') expectations regarding the level of clinical reasoning (CR) skill students are anticipated to achieve and of how CR is taught and assessed in the clinic.

Objective. The purpose of this research was to explore how CIs teach and assess student physical therapists' CR skills and garner an understanding of CIs' expectations related to CR skills attained at various levels of clinical experience.

Design. A descriptive, cross-sectional survey was used to gather data from CIs.

Methods. Participants identified from an American Physical Therapy Association database of credentialed CIs were asked to complete an 18-question electronically delivered survey.

Results. A nationwide sample of 749 CIs completed the survey. The data clearly demonstrated that CIs value and recognize the importance of clinical education in the development of CR. They also value the use of engaged discussion and reflection to foster the development of reasoning skills. Most CIs did not expect students to achieve entry-level CR skills by the final year of the program.

Conclusions. Clinical instructors see the importance of clinical experiences and engaged discussion to foster the development of CR. Further research is needed to understand why most clinical educators did not expect students to achieve “entry-level” CR skills as defined by the Clinical Performance Instrument by the final year of the program.

Karen Huhn is an assistant professor, Rutgers, The State University of New Jersey, 65 Bergen St, Newark, NJ. huhnka@shrp.rutgers.edu

Lisa Black is an associate professor and Director of Clinical Residency Education Programs, Creighton University, Omaha, NE. lisa.black@creighton.edu

Nicole Christensen is an associate professor, Samuel Merritt University, Oakland, CA. nchristensen@samuelmerritt.edu

Jennifer Furze is an associate professor, Creighton University, Omaha, NE. Jennifer.Furze@creighton.edu

Ann Vendrely is a professor of physical therapy, Governors State University, 1 University Parkway, University Park, IL 60484. avendrely@govst.edu

Susan Wainwright is an associate professor, Thomas Jefferson University, Philadelphia, PA. Susan.Wainwright@jefferson.edu

The authors report no conflict of interest.

Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal's Web site (www.aptaeducation.org)


Received January 18, 2018

Accepted February 27, 2018


INTRODUCTION AND BACKGROUND

The dynamic and rapidly changing environment of health care increasingly demands a high level of accountability from practitioners to produce effective, meaningful improvement in patients. Physical therapists rely on their clinical reasoning (CR) process to decide how to manage a patient to facilitate maximal improvement. This CR process is a highly complex, multifaceted skill that is developmental in nature and requires a substantial amount of practice with realistic patients to develop. The American Physical Therapy Association (APTA) and the Commission on Accreditation in Physical Therapy Education recognize the importance of CR skills by mandating CR as an expected outcome of both education and practice.1,2

Our understanding of the CR process used by physical therapists is still evolving and is substantially derived from work in medicine and nursing. What is known is that the process includes several different types of reasoning and that a therapist will use more than 1 type of reasoning in any given situation, meaning that the process is context dependent.3,4 In addition, it is believed that developing therapists move through a continuum of skill development. A lack of clinical experience leads novices to rely heavily on basic science knowledge.5 As they gain some experience in clinical practice through their clinical affiliations, they begin to chunk information together or build scripts related to patients they have seen. This script formation makes the CR process less cognitively cumbersome and thereby more efficient and effective.6 As a therapist continues to gain clinical experience, the CR process becomes even more refined; therapists can quickly and efficiently identify what should be done based on previous patient experiences.

Learning theories that support the development of CR skills include constructivism, whose main tenet is that knowledge should be gained in the context in which it is to be used.7 Situated learning stresses the importance of learning by participating in an activity to form a professional identity rather than learning "facts."8 Deliberate practice theory supports the use of longitudinal mentoring, again in realistic situations, to help identify specific activities to be deliberately practiced to improve performance.9 Given that CR skill development is highly contingent on clinical experience, the role of clinical instructors (CIs) in supporting students' development of CR skills becomes increasingly evident.

The Clinical Reasoning Curricula Assessment and Research Consortium (CRCARC) (http://acapt.org/index.php/consortia/itemlist/category/33) of the American Council of Academic Physical Therapy was created to develop best practice standards for teaching and assessing CR in physical therapist education. Discussions among the consortium membership at several meetings made it apparent that members were not all talking about CR in the same manner. CR was being defined, taught, and assessed with substantial variability among both academic and clinical educators. To move the consortium's agenda forward, it was decided that a better understanding of how both academic and clinical educators were defining, teaching, and assessing CR was required. Consortium members designed two surveys: one directed at academic educators, the results of which are published elsewhere,10 and a second directed at clinical educators. Both surveys included a definition of CR that consortium members derived from the literature, to encourage respondents to consider CR from a similar perspective. The operational definition of CR was: "Clinical reasoning is a nonlinear, recursive cognitive process in which the clinician synthesizes information collaboratively with the patient, caregivers, and the healthcare team in the context of the task and the setting. The clinician reflectively integrates information with previous knowledge and best available evidence to take deliberate action."11-13

Results of the survey of academic educators indicated a lack of consensus on how CR is taught and assessed in academia. Most academic educators were using multiple frameworks, tools, and assessments and relying on instructor-created tools.10 As stated earlier, CIs play a significant role in the growth of students' CR skills; therefore, the researchers believed that it was important to present the views of CIs in conjunction with those of the academic educators. This article reports the findings of the second survey, directed at clinical educators. The purpose of this research was to explore the perceptions of CIs regarding teaching and assessing student physical therapists' CR skills. A second aim was to promote greater awareness of the state of CR in physical therapist clinical education.


METHODS

A descriptive cross-sectional survey designed to explore how CR is defined, taught, and assessed in academic Doctor of Physical Therapy (DPT) programs was modified to examine how CR is defined, taught, and assessed by CIs. The study was reviewed and approved by the Institutional Review Boards at Creighton University, Thomas Jefferson University, Governors State University, and Rutgers, The State University of New Jersey.


Development of the Survey

Members of the CRCARC designed a survey intended to assess how academic educators define, teach, and assess CR skills. Consortium members then refined the academic survey to gather the same information as well as additional data relevant to teaching CR skills in the clinical education setting. Specifically, branched questions asked CIs to describe the level of CR ability (beginner, advanced beginner, intermediate, advanced intermediate, and entry level) expected of affiliating students during clinical experiences in their first, second, and last year of professional education. These questions used terminology from the Clinical Performance Instrument (CPI) to define the forced-choice categories.14 The survey provided the following rating scale from the CPI: beginner = 100% supervision; advanced beginner = 75–90% supervision with patients with simple conditions, 100% supervision with patients with complex conditions; intermediate = <50% supervision with patients with simple conditions, 75% supervision with patients with complex conditions; advanced intermediate = independent with patients with simple conditions, <25% supervision with patients with complex conditions or new patients, 75% caseload management; and entry level = capable of managing patients with simple or complex conditions 100% of the time. There was also interest in understanding whether CIs encouraged students to self-reflect and whether they believed that academic programs adequately prepared students for the clinic. The final CI survey (Appendix 1, Supplemental Digital Content 1, http://links.lww.com/JOPTE/A23) consisted of 18 questions: six branched questions directly related to CR and 12 on the demographics and experience levels of CI respondents.

A sample of convenience drawn from an APTA database of credentialed CIs received the survey. Approximately 40,000 credentialed CIs were included in the database, but only 6,983 allowed their email addresses to be used for research purposes. Survey results were anonymous, as the email addresses and names of respondents were not visible to the researchers. No attempt was made to obtain a representative sample because of the limited number of email addresses available. The authors did not have access to US postal addresses, so no attempt was made to send paper surveys. QuestionPro Online Survey Software was used to deliver the survey and compile results. APTA staff members managed the distribution of the survey. The survey software identified who had not yet completed the survey and sent two follow-up email requests.

QuestionPro software calculated frequencies of response. Four of the researchers (K.H., L.B., S.W., and A.V.) individually coded the open-ended responses and then discussed their findings through several conference calls, leading to the development of a coding scheme representative of the participants' responses. Reliability of the content analysis was confirmed by 100% agreement among the researchers.15 These codes were then categorized into themes. Trustworthiness of the data was ensured through presentation of direct quotes from the participants to support the themes.15


RESULTS

The survey was delivered electronically by the APTA to 6,983 email addresses, 4,780 of which were deliverable. Two subsequent email requests were sent to those who did not complete the survey. Final survey results indicate that 1,420 recipients viewed the survey email, 934 opened the survey, 885 began it, 770 completed all but the demographic questions, and 749 completed the entire survey, for an overall response rate of 15%.
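The funnel above reduces to simple arithmetic. As a quick check (a minimal sketch; the variable names are illustrative and not taken from the article), the reported overall rate follows from completions over deliverable addresses:

```python
# Survey funnel as reported in the Results section
sent = 6983          # email addresses the survey was sent to
deliverable = 4780   # addresses that did not bounce
completed = 749      # respondents who completed the entire survey

# Overall response rate: completions over deliverable addresses
response_rate = completed / deliverable
print(f"{response_rate:.1%}")  # ~15.7%, reported as an overall rate of 15%
```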

A spreadsheet of the geographical distribution and highest degree earned of all credentialed CIs was obtained from the APTA. The geographical distribution and highest earned degree of respondents were compared with those of all credentialed CIs to determine whether the sample was representative (Figures 1 and 2). Results indicated a good match for general representativeness by geographic location. The exception was an absence of respondents whose highest earned degree was a certificate (0% of the sample). In the sample, 72% of respondents reported holding the basic level of APTA CI training, whereas 18.5% had advanced certification and 8.8% had not completed the APTA credentialing course. The survey was originally sent only to credentialed CIs; however, CIs were allowed to share it with colleagues, which may account for the small number of respondents who were not credentialed. In the sample, 87% of respondents had not completed a residency or fellowship, and 67% had not completed a clinical specialization. The 33% who had achieved American Board of Physical Therapy Specialties certification represented eight specialty areas: Orthopedics (15%), Neurology (5%), Sports (4%), Geriatrics (3%), Pediatrics (3%), Cardiovascular and Pulmonary (1%), Women's Health (1%), and Clinical Electrophysiology (0.5%).

Figure 1

Figure 2

The vast majority (98.3%, n = 736) of CIs concurred with the definition of CR outlined by the CRCARC, and 91% (n = 681) indicated that they explicitly taught CR skills to their students. Clinical instructors were asked to rate the level of CR skills they expected from the students they supervised; the rating scale from the CPI, as listed earlier, was provided.

When asked to rate the level of CR skills they expected from students participating in a clinical experience in their first professional year, 37.8% (n = 283) of CIs indicated beginner, whereas 48.4% (n = 362) indicated advanced beginner. Students participating in clinical experiences in their second professional year were primarily expected to have intermediate-level skills (64.2%, n = 480). Students participating in clinical experiences in their last professional year were primarily expected to be at advanced intermediate (50%, n = 374) or entry level (42.4%, n = 317).
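The counts and percentages in the paragraph above can be cross-checked against each other. A small sketch (our arithmetic, not part of the article's analysis) shows that each pair implies a per-question denominator of roughly 748 respondents, suggesting minor item nonresponse relative to the 749 who completed the survey:

```python
# Cross-check each reported count against its reported percentage.
# Implied denominator for a question = count / (percentage / 100).
reported = [
    ("beginner, year 1", 283, 37.8),
    ("advanced beginner, year 1", 362, 48.4),
    ("intermediate, year 2", 480, 64.2),
    ("advanced intermediate, final year", 374, 50.0),
    ("entry level, final year", 317, 42.4),
]
for label, count, pct in reported:
    implied_n = count / (pct / 100)
    print(f"{label}: implied n = {implied_n:.1f}")  # all fall between 747 and 749
```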

The survey asked respondents to indicate which tools they use to assess students' CR skills. Fifty-four percent (n = 404) reported using the CPI, and 15% (n = 112) reported using the Guide to Physical Therapist Practice.16 Respondents also had an opportunity to enter "other" tools they use to assess CR (n = 52) (Table 1).

Table 1

Most CIs (71%, n = 531) responded that academic programs adequately prepare students to use CR skills effectively in the clinical setting. Twenty-nine percent (n = 217) of respondents did not believe that academic programs adequately prepared students. The survey asked those respondents to indicate which of the following they believed would help students: patient simulations (28.55%), mentoring programs (23.93%), more clinical fieldwork (21.3%), specific courses (19.83%), or other (6.32%). Again, respondents had an opportunity to enter “other” suggestions to facilitate students' CR skills in the didactic portion of their education. Investigators evaluated these 37 responses for themes. Table 2 provides the most frequently reported themes with sample responses.

Table 2

Respondents were then asked whether they encourage students to engage in self-reflection on their reasoning abilities, and 88% (n = 659) reported that they do. A follow-up question asked how they encourage self-reflection, and respondents were provided the following options: debriefing with the CI (44.51%), thinking aloud during treatment (27.04%), and journal assignments (16.7%). Once again, respondents entered information in response to the "Other" prompt in this question, and 191 responses were entered. The authors evaluated these "other" responses for themes (Table 3).

Table 3


DISCUSSION

The purpose of this research was to explore the perceptions of CIs regarding expectations for student physical therapists' CR skills and to gain insights into how these skills are defined, taught, and assessed. Survey results indicated that CIs agree with the Consortium's definition of CR. Respondents requested that the definition be kept simple, without too much jargon. The authors believe that this is the first time that interested stakeholders have been asked to agree or disagree with a definition of CR. CIs expect a developmental improvement in CR skills, with movement from advanced beginner to intermediate over the first 2 years. Fewer than half of the respondents (42.4%) indicated that they expected students to achieve entry-level skills by the time they graduate; 49.8% reported that they expect a student to achieve advanced intermediate, rather than entry-level, CR skills in the final professional year. Advanced intermediate was defined as independent with patients with simple conditions, <25% supervision with patients with complex conditions or new patients, and management of 75% of a caseload.

There are a few possibilities to explain why half of the surveyed CIs do not expect students to achieve entry-level CR skills. One possible factor is the descriptions provided for rating students. The description of entry level is management of 100% of a caseload with no provision for supervision or assistance: a 25% increase in caseload management over the next lowest rating of advanced intermediate and the elimination of any type of supervision. Sass et al17 found evidence that clarification of terms has an impact on how CIs apply ratings. They provided CIs with an entry-level definition based on, but not identical to, the one used in this study and reported that CIs believed more clarification was needed to understand terms such as "highly skilled" and "complex."17 A second possibility is that many CIs may not consider the achievement of "entry-level" CR ratings to be a realistic outcome for students before they enter independent practice. The findings of Sass et al support this hypothesis: their respondents also questioned the ability to achieve a high level of competence in all practice settings.17

Findings from a study of clinical mentors in nursing suggest that registered nurses and academic faculty differed in their perceptions surrounding the level of clinical skills first-year students should have during their first clinical experience. Clinical mentors emphasized a range of basic skills and acknowledged that many skills could only be learned through experience.18 In addition, the nursing literature identifies a gap between knowledge and skills required in academic contexts and those needed for clinical practice.19 Published research findings clearly describe observed differences in the ways novice and expert clinicians clinically reason and self-reflect in practice.20

Clinical instructors use multiple "formal tools" to assess CR, with most (54%) reporting use of the CPI and the majority reporting use of more than 1 tool. Many were also using a variety of facility-specific tools, indicating the lack of a "gold standard" for assessing CR skills across clinical education sites. The data represent more than 700 responses to this question, which may indicate the level of importance and value CIs place on assessing CR skills. It is interesting that only 54% of CIs reported using the CPI as a tool to assess CR. Considering this finding in light of the limited scope of criteria within the CPI that directly address aspects of CR performance, it is not surprising that CIs use other means of assessing this complex, multifaceted aspect of clinical practice. It is possible that CIs value a more interactive method of assessing CR, and the data support this interpretation: responses describing the "other" tools used to assess CR included statements related to discussion, think-aloud activities, and observation of student performance. Our findings are congruent with recommendations in the educational literature promoting the use of direct observation and think-aloud activities to support the development of CR skills in future health professionals.11

Most CIs (71%) believed that academic programs adequately prepared students to use CR in the clinic. Feedback from CI respondents about how academic programs could improve DPT students' CR skills fell into two categories: 1) increasing the opportunities students have in the classroom to learn the skills necessary for effective CR and 2) increasing and varying experiential learning activities. To provide more opportunities for faculty to model, and students to demonstrate, CR skills, CIs suggested that techniques they use in the clinic (Table 3) be applied consistently in the classroom. This raises additional questions about the most effective educational strategies for teaching CR in the academic setting as compared with the clinical setting. In the academic learning context, there are typically a large number of students and a small number of faculty, whereas in the clinical setting most commonly only one or two students work with a single CI. There is likely a continuum of CR teaching and facilitation strategies that should be explored to determine best practices in both didactic and clinical education to foster effective development of CR skills throughout entry-level education.

Intentional and explicit application of reflection to the reasoning process is necessary to develop the reasoning skills consistent with expert practice.21 Clinical instructors clearly value the role of reflection, with 97% indicating that they encourage student reflection. This finding is consistent with previous reports that critical reflection contributed to student learning by increasing validation and empowerment, broadening perspectives,22 and developing CR skills.23,24 Analysis of free-text responses indicated that CIs encourage both reflection-on-action (thinking back on an experience and exploring the reasons for, and consequences of, one's actions) and reflection-for-action (planning for future experiences within the context of past experiences).25,26 Reflection-for-action was evidenced by CIs' numerous reports of requiring students to plan for patient interactions by discussing or writing a plan of care before treatment or by using evidence to plan a treatment. Reflection-on-action fell into two categories: asking students to reflect on their own performance (through journaling, discussion with the CI or others, and setting weekly goals) and encouraging students to reflect on patient performance to formulate goals and adjust plans of care. This facilitated reflection is consistent with recommendations from the literature; Schon identified that reflection is much more effective when it is facilitated by a mentor and done as a social activity.26

Reflection, context, dialog, and time are key elements of the critical-thinking process necessary to arrive at a clinical decision.23 The interaction between CIs and student physical therapists reported in the free-text responses provides time and opportunity for dialog to facilitate reflection within the situated learning environment of clinical practice. Further, the CIs reported frequent use of think-aloud activities and dialog to facilitate this reflection. Because these students are novices, their use of reflection is most evident when they are confronted with ambiguity in clinical practice.27 Consistent with a previous study of novice physical therapists, reflection-in-action was not a focus of CI and student interaction.28 Although DPT students may be limited in their ability to use reflection-in-action to assess the patient, as well as their own performance, acknowledging the value of developing these skills would serve to further develop CR abilities in both academic and clinical settings.

When combined with published results about CR in the academic setting,10 the findings of this survey provide a more complete picture of what is occurring in professional DPT education. It is clear that educators in both academic and clinical settings value the development of CR skills. Instruction in CR is incorporated into both academic and clinical settings using a variety of methods that seem to be both setting and instructor specific. There are marked differences in how CR is assessed across the academic and clinical settings. Traditional written and practical examinations, as well as the CPI, are predominantly used in the academic setting.10 Although the CPI is also used in the clinical setting, CIs rely on a variety of CR assessment tools, application of evidence-based practice, CR frameworks, and their own judgment when assessing student CR abilities.

There are several limitations in this work. The overall response rate was 15%. Although this is clearly low, recent research has called into question the assumption that a high response rate indicates quality results; no strong relationship between nonresponse rates and nonresponse bias has been identified.29 In addition, response rates for web-based surveys are typically lower than those of paper-based surveys.30 We used a sample of convenience, which limits the ability to generalize the findings. There is the possibility of coverage error, which can occur when a list of potential subjects does not include all members of the group.31 In this survey, the sample included only those CIs who agreed to share their email addresses, which was not the entire list of credentialed CIs. Capturing the views of all CIs would be challenging, but reaching CIs through the various geographical consortia might yield a greater return.

It is clear from the responses that CIs place value and importance on the facilitation of CR skills and on their role as educators. They recognize the importance of clinical experience in the development of CR and suggest more and varied clinical experiences. These CIs also value the use of engaged discussion and reflection to foster the development of reasoning skills. The findings provide an understanding of the current state of CR assessment and instruction, which may help formulate the most effective curricular activities across both academic and clinical settings. The finding that 49.8% of CIs do not expect students to reach entry-level CR skills by the end of their education warrants further exploration. Delving deeper into why CIs do not expect entry-level CR skills, and whether there are certain aspects of entry-level CR that students are not expected to achieve, might provide insights into changes that may be needed in physical therapist education.


ACKNOWLEDGMENTS

The authors acknowledge the members of the Clinical Reasoning Curricula and Assessment Research Consortium, the American Council of Academic Physical Therapy, and Lisa McLaughlin.


REFERENCES

1. Commission on Accreditation in Physical Therapy Education. Standards and Required Elements for Accreditation of Physical Therapist Education Programs. Alexandria, VA: CAPTE; 2014.
2. American Physical Therapy Association. Physical therapist clinical performance instrument. 2006. http://www.apta.org/PTCPI/. Accessed June 2016.
3. Edwards I, Jones M, Carr J, Braunack-Mayer A, Jensen G. Clinical reasoning strategies in physical therapy. Phys Ther. 2004;84(4):312–330.
4. Higgs J, Jones MA, Loftus S, Christensen N. Clinical Reasoning in the Health Professions. 3rd ed. London: Elsevier; 2008.
5. Boshuizen HP, Schmidt HG. On the role of biomedical knowledge in clinical reasoning by experts, intermediates and novices. Cogn Sci. 1992;16:153–184.
6. Norman G. Building on experience-The development of clinical reasoning. N Engl J Med. 2006;355(21):2251–2252.
7. Jensen G, Mostrom E. Handbook of Teaching and Learning for Physical Therapists. 3rd ed. Saint Louis, MO: Elsevier; 2013.
8. Ratcliffe T, Durning S. Theoretical concepts to consider in providing clinical reasoning instruction. In: Trowbridge R, Rencic J, Durning S, eds. Teaching Clinical Reasoning. Philadelphia, PA: American College of Physicians; 2015.
9. Ericsson KA. An expert-performance perspective of research on medical expertise: the study of clinical performance. Med Educ. 2007;41(12):1124–1130.
10. Christensen N, Black L, Furze J, Huhn K, Vendrely A, Wainwright S. Clinical reasoning: survey of teaching methods, integration and assessment in entry-level physical therapist academic education. Phys Ther. 2017;97(2):175–186.
11. Higgs J. Fostering the acquisition of clinical reasoning skills. New Zealand J Physiother. 1990;18:13–17.
12. Higgs J, Jones M. Clinical Reasoning in the Health Professions. 2nd ed. Boston, MA: Butterworth-Heinemann; 2002.
13. Newble D, Norman G, VanDer Vleuten C. Assessing Clinical Reasoning. 2nd ed. Oxford: Butterworth-Heinemann; 2000.
14. American Physical Therapy Association. Physical therapist clinical performance instrument (PT CPI). http://www.apta.org/PTCPI/. Updated June 15, 2016. Accessed March 2016.
15. Patton M. Qualitative Evaluation and Research Methods. 2nd ed. Thousand Oaks, CA: Sage Publications; 1990.
16. American Physical Therapy Association. Guide to Physical Therapist Practice 3.0. Alexandria, VA: American Physical Therapy Association; 2015.
17. Sass K, Frank L, Theile A, et al. Physical therapy clinical educators' perspectives on students achieving entry-level clinical performance. J Phys Ther Educ. 2011;24:46–59.
18. Astin F, McKenna L, Newton J, Moore-Coulson L. Registered nurses' expectations and experiences of first year students' clinical skills and knowledge. Contemp Nurse. 2005;18(3):279–291.
19. Santucci J. Facilitating the transition into nursing practice: concepts and strategies for mentoring new graduates. J Nurses Staff Dev. 2004;20(6):274–285.
20. Norman G, Brooks L, Cunnington JPW, Shall V, Marriott M, Regehr G. Expert-novice differences in the use of history and visual information from patients. Acad Med. 1996;71(10):S62–S64.
21. Wainwright SF, Shepard KF, Harman LB, Stephens J. Novice and experienced physical therapist clinicians: a comparison of how reflection is used to inform the clinical decision-making process. Phys Ther. 2010;90(1):75–88.
22. Scully R, Shepard K. Clinical teaching in physical therapy education: an ethnographic study. Phys Ther. 1983;63:349–358.
23. Forneris S. Exploring the attributes of critical thinking: a conceptual basis. Int J Nurs Educ Scholarsh. 2004;1(9):1–18.
24. Dreifuerst K. Using debriefing for meaningful learning to foster development of clinical reasoning in simulation. J Nurs Educ. 2012;51(6):326–333.
25. O'Donnell A, Reeve J, Smith J. Educational Psychology: Reflection for Action. 3rd ed. Hoboken, NJ: John Wiley & Sons; 2011.
26. Schon D. The Reflective Practitioner: How Professionals Think in Action. New York, NY: Basic Books Inc; 1983.
27. Mamede S, Schmidt HG, Rikers RMJP, Penaforte JC, Coelho-Filho JM. Breaking down automaticity: case ambiguity and the shift to reflective approaches in clinical reasoning. Med Educ. 2007;41(12):1185–1192.
28. Embrey DG, Guthrie MR, White OR, Dietz J. Clinical decision making by experienced and inexperienced pediatric physical therapists for children with diplegic cerebral palsy. Phys Ther. 1996;76(1):20–33.
29. Curtin R, Presser S, Singer E. The effects of response rate changes on the index of consumer sentiment. Public Opin Q. 2000;64:413–428.
30. Cook C, Heath F, Thompson R. A meta-analysis of response rates in web- or internet-based surveys. Educ Psychol Meas. 2000;60(6):821–836.
31. Draugalis J, Plaza C. Best practices for survey research reports revisited: implications of target population, probability sampling, and response rate. Am J Pharm Educ. 2009;73(8).
Keywords:

Clinical Reasoning; Assessment; Teaching and Learning

Copyright 2018 Education Section, APTA