Academic Medicine: September 2012 - Volume 87 - Issue 9
doi: 10.1097/ACM.0b013e3182628fa4
Case Presentations

Student Uncertainties Drive Teaching During Case Presentations: More So With SNAPPS

Wolpaw, Terry MD, MHPE; Côté, Luc MSW, PhD; Papp, Klara K. PhD; Bordage, Georges MD, PhD

Author Information

Dr. Wolpaw is associate dean, Case Western Reserve University School of Medicine, Cleveland, Ohio.

Dr. Côté is professor, Department of Family Medicine and Emergency Medicine, Laval University, Quebec City, Quebec, Canada.

Dr. Papp is associate dean, State University of New York, Downstate Medical Center, Brooklyn, New York. When this study was conducted, she was associate professor, Department of General Medical Sciences, Case Western Reserve University School of Medicine, Cleveland, Ohio.

Dr. Bordage is professor, Department of Medical Education, University of Illinois at Chicago, Chicago, Illinois.

Correspondence should be addressed to Dr. Wolpaw, Case Western Reserve University School of Medicine, Office of Curricular Affairs, T402, 10900 Euclid Ave., Cleveland, OH 44106-4924; telephone: (216) 368-6989; e-mail: Terry.wolpaw@case.edu.

Abstract

Purpose: To compare the nature of uncertainties expressed by medical students using the six-step SNAPPS technique for case presentations (Summarize history and findings; Narrow the differential; Analyze the differential; Probe preceptors about uncertainties; Plan management; Select case-related issues for self-study) versus those expressed by students doing customary presentations and to elucidate how preceptors respond.

Method: The authors performed a secondary analysis in 2009 of data from a 2004–2005 randomized study, comparing SNAPPS users’ case presentations with other students’ presentations. Authors coded transcriptions of audiotaped presentations to family medicine preceptors for type of student uncertainties, nature of preceptor responses, alignment of preceptor responses with uncertainty types, and expansion of preceptors’ responses beyond addressing uncertainties.

Results: The analysis included 19 SNAPPS and 41 comparison presentations. SNAPPS students expressed uncertainties in all of their case presentations, nearly twice the proportion observed in the comparison group (χ²(1 df) = 12.89, P = .0001). Most SNAPPS users’ uncertainties (24/44 [55%]) focused on diagnostic reasoning compared with 9/38 (24%) for comparison students (χ²(1 df) = 8.08, P = .004). Uncertainties about clinical findings and medications/management did not differ significantly between groups. Preceptors responded with teaching aligned with the uncertainties and expanded 24/66 (36%) of their comments.

Conclusion: Students can drive the content of the teaching they receive based on uncertainties they express to preceptors during case presentations. Preceptors are ready to teach at “the drop of a question” and align their teaching with the content of students’ questions; these learning moments—in context and just-in-time—can be created by students.

Although presenting cases to preceptors is certain to be part of medical training, the nature of the teaching and learning that occur during these case presentations is unclear. One validated, six-step, learner-centered case presentation technique that facilitates the expression of clinical reasoning and uncertainties during case presentations is SNAPPS (Summarize relevant patient history and findings; Narrow the differential; Analyze the differential; Probe the preceptor about uncertainties; Plan management; Select case-related issues for self-study).1,2 The SNAPPS technique rebalances the proportion of facts and reasoning that students provide during a case presentation but does not necessarily lengthen it. When using this technique, the learner focuses, first, on reporting relevant, sufficiently detailed aspects of the history and findings (condensing the data portion of the case presentation); then, he or she uses the remainder of the case presentation to express clinical reasoning and uncertainties. Step 1, the concise reporting of relevant data obtained from the history, physical examination, laboratory testing, and imaging, is followed by five steps that facilitate the expression of diagnostic reasoning and case-related uncertainties (see Appendix).
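For readers who want the six steps at a glance, they can also be written down as an ordered checklist. The sketch below (Python) is an illustrative representation built only from the step names given above; it is not part of the published SNAPPS materials, and the helper function is hypothetical.

```python
from enum import Enum

class SnappsStep(Enum):
    """The six SNAPPS steps, in the order the student works through them."""
    SUMMARIZE = 1  # Summarize relevant patient history and findings
    NARROW = 2     # Narrow the differential
    ANALYZE = 3    # Analyze the differential
    PROBE = 4      # Probe the preceptor about uncertainties
    PLAN = 5       # Plan management
    SELECT = 6     # Select case-related issues for self-study

def follows_snapps_order(steps_in_presentation):
    """Hypothetical check: True if a presentation touches each step once, in order."""
    return list(steps_in_presentation) == list(SnappsStep)

# Example: a complete presentation outline covers all six steps in sequence.
assert follows_snapps_order(list(SnappsStep))
```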


During the fourth step of a SNAPPS case presentation—the Probing step—the student expresses uncertainties to the preceptor by asking questions about difficulties, areas needing clarification, or alternative approaches. This step guarantees the learner an opportunity to obtain not only clarification but also feedback that focuses on his or her immediate needs. For students, resolving personal difficulties or uncertainties can maximize learning.3 For teachers, the pairing of a student’s expression of uncertainty with corresponding feedback presents an ideal learning moment in the context of patient care.

When students express uncertainties, they are acknowledging that their understanding is incomplete. Cooke, Irby, and O’Brien4 have stressed the importance both of conceding incomplete comprehension and of asking questions as means for learners to develop strategies for personal growth. Expressing uncertainties and asking clarifying questions is a way for students to improve knowledge, strengthen decisions, and advance their critical thinking skills.5

When a student expresses an uncertainty during a case presentation, the uncertainty serves as a needs assessment for the teacher and offers an opportunity for a just-in-time teaching and learning moment. The student reveals something about which he or she is confused, unclear, or uncertain. The teacher can provide feedback that directly targets the student’s learning needs. These learning moments—as clearly captured in Ericsson’s6 theory of expertise development (i.e., combining deliberate mixed practice with feedback)—play a key role in the development of clinical reasoning. Ericsson proposes that the amount of deliberate practice with feedback correlates with the level of performance, and that deliberate practice with the receipt of feedback must be sustained over long periods to achieve maximal skill levels.6 Expressing uncertainties and receiving feedback become the yin and yang—the complementary means—to learning and developing expertise in the busy patient care setting.

There is very little research on the characteristics of student uncertainties expressed during case presentations.7 In a 2004–2005 study of the SNAPPS technique, students asked questions of their preceptors as an integral part of their case presentations. The results indicated that SNAPPS students expressed uncertainties more than twice as often as students who used traditional presentation formats.2 Although the results from that study provided important insights about the student’s role during case presentations, the study did not explore the nature of the uncertainties that students expressed, nor the responses that preceptors gave. Further, we do not know whether the uncertainties the SNAPPS students express differ from those expressed by students using traditional case presentation formats.

Thus, in 2009 we performed a secondary analysis of the case presentations from our 2004–2005 SNAPPS study. Given the importance of mixed practice with feedback in the development of expertise,6 we were interested in the types of uncertainties students express and in the responses preceptors provide during case presentations. The current study addresses two specific research questions: What is the nature of the uncertainties that students express during case presentations, specifically, during SNAPPS presentations as compared with during traditional presentations? And, what is the extent and nature of the preceptors’ responses to students’ uncertainties?

Because the SNAPPS case presentation technique, which includes narrowing and analyzing the differential, serves as a cognitive forcing strategy for the expression of clinical diagnostic reasoning,1,2 we hypothesized that the uncertainties students expressed to preceptors using the SNAPPS technique for case presentations would reflect the technique’s emphasis on clinical reasoning. We also hypothesized that the uncertainties students expressed using other presentation techniques would reflect an emphasis on the details of the history, physical examination, and/or management plan.

Method

Design

We performed a qualitative content analysis8 of student case presentations to preceptors from our 2004–2005 SNAPPS study.2 For that original SNAPPS study, we used a posttest-only, randomized comparison-groups design involving 64 medical students during their family medicine clerkships.

During the 2004–2005 academic year, we randomly assigned students to one of three case presentation training groups: SNAPPS training, feedback training (controlling for training time), and usual-and-customary instruction. The intervention involved the preceptors as well as the students. Students and preceptors provided informed consent. We offered no incentives for participation. Participation was voluntary and independent of assessment (for students) and of promotion (for preceptors). The study coordinator securely stored all tapes and data.

The SNAPPS preceptors received an in-person, 20-minute orientation that took place two weeks before the clerkship. Their orientation included an 11-minute instructional video on the SNAPPS technique and an opportunity to ask questions. These preceptors also received a small card listing the six SNAPPS steps and, the day before the clerkships began, a phone call reminder that their students would be using the SNAPPS technique. The orientation for the second group of preceptors (the feedback group) comprised a telephone call (of about 10 minutes) two weeks before the clerkships began, during which we discussed the importance of feedback and provided the opportunity for the preceptors to ask questions. The day before their students arrived, these preceptors received a phone call reminding them that the students would be looking forward to receiving feedback from them. Finally, the third (usual-and-customary) group of preceptors received only a telephone call explaining that students would be arriving in two weeks.

The students in the SNAPPS group participated in a lunchtime training session and two lunchtime follow-up sessions. During the first session, of about 45 minutes, they watched the 11-minute SNAPPS instructional video and role-played a SNAPPS case presentation. They had the opportunity to ask questions, and they received a card listing the SNAPPS steps. During the two follow-up sessions (20 minutes each), the students practiced SNAPPS case presentations and discussed any barriers to using the method in the office setting. The students in the second group (the feedback group) also participated in three lunchtime training sessions of, respectively, 45, 20, and 20 minutes. Rather than learning the SNAPPS technique, these students learned the skills of asking preceptors for feedback. Finally, the usual-and-customary group did not receive any specific feedback or case presentation training. They conducted their case presentations in the manner they chose.

During the last week of their four-week family medicine rotation, the students audiotaped as many of the case presentations they made to their preceptors as they could (see also the 2009 study2).

We made two methodological modifications relative to the original study. First, whereas in the 2004–2005 study we coded only the uncertainties expressed after the first step of the SNAPPS technique, in the present study we coded any and all uncertainties expressed throughout the entire case presentation. Second, because the two comparison groups in the original study (feedback and usual-and-customary) did not differ significantly on any of the measured outcomes, we collapsed these groups into a single comparison group for our current analyses.

The University Hospitals of Cleveland institutional review board for human investigation and the University of Illinois at Chicago office for the protection of research subjects approved the study.

Outcomes

We established two outcome categories for our analyses, one related to student uncertainties, the other to preceptor responses.

Student uncertainties. We analyzed three aspects of student uncertainties: (1) the frequency of case presentations containing uncertainties, (2) the number of uncertainties per case presentation, and (3) the nature (i.e., content category) of the uncertainties. We operationally defined “uncertainties” as any student expression of difficulties or areas needing clarification, whether formulated as a question or statement. See Table 1 for examples.

Preceptor responses. We analyzed three aspects of preceptor responses: (1) the presence or absence of a response to a student uncertainty, (2) the nature (i.e., content category) of the preceptor response, and (3) whether the preceptor’s response went beyond answering the student’s uncertainty (expanded to include additional information).

Coding and data analysis

In 2009 we transcribed the audiotape cassettes for each student, labeling each transcription with only confidential identification numbers. A case presentation had to meet the following two criteria to be transcribed: (1) the audio quality of the recording was of sufficient clarity to be understood and (2) the full student–preceptor interaction was included in the recording. The transcriptionist used the first case presentation for each student that met both criteria because in 2004–2005 we found that students’ first presentation was representative of all their case presentations.2

Three coders (T.W., L.C., G.B.) met over a two-day period in 2009 to code the transcribed case presentations. They used printed copies of the transcribed case presentations labeled with random numbers assigned by the study administrator. Any patient or student identifying information was masked. The three coders, blinded to study group and participant, first read and coded each case presentation independently. They coded the student uncertainties and preceptor responses using an established content analysis procedure.8 They continued to generate descriptive codes until all case presentations were coded. After independently coding each case presentation, the three coders compared their codes and discussed discrepancies until they reached consensus.
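Purely as an illustration of this independent-then-consensus step, and not the authors’ actual codebook or coding software, the following sketch (Python, with hypothetical coder labels) flags the presentations on which three coders’ category codes disagree and therefore need discussion.

```python
# Illustrative sketch only: hypothetical data, not the study's codebook or tooling.
def discrepancies(codes_by_coder):
    """Given {coder: {presentation_id: category}}, return the presentation IDs
    on which the independent codes do not all agree (to be resolved by consensus)."""
    all_ids = set().union(*(codes.keys() for codes in codes_by_coder.values()))
    return sorted(pid for pid in all_ids
                  if len({codes.get(pid) for codes in codes_by_coder.values()}) > 1)

# Example with hypothetical labels drawn from the study's three broad categories.
codes = {
    "coder1": {"p01": "diagnostic reasoning", "p02": "clinical findings"},
    "coder2": {"p01": "diagnostic reasoning", "p02": "medications/management"},
    "coder3": {"p01": "diagnostic reasoning", "p02": "clinical findings"},
}
print(discrepancies(codes))  # ['p02'] -> discussed until consensus is reached
```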

We analyzed the results for each outcome using qualitative descriptions as well as frequency distributions and chi-square tests for between-group comparisons. If chi-square tests showed significant differences overall, we performed subsequent planned comparisons. We used the Statistical Package for the Social Sciences (SPSS; version 17; Chicago, Illinois).

Results

Below, we present the number of participants and the results of our content analysis. We have organized our results into student uncertainties and preceptor responses, and we have provided comparisons between presentations from the students in the SNAPPS group and students in the comparison group.

Participants

This secondary analysis included case presentations and corresponding preceptor responses from 60 of the 64 students in the initial study, for a total of 60 analyzed case presentations (19 SNAPPS and 41 comparison). We were unable to locate audiotapes for four students (one from each of the original two comparison groups and two from the SNAPPS group).

Student uncertainties

Frequency of case presentations containing uncertainties. Students expressed uncertainties in just over two-thirds of the 60 case presentations (41/60; 68%). Every student in the SNAPPS group expressed at least one uncertainty during his or her case presentation (19/19; 100%) compared with 54% of the students (22/41) in the comparison group (χ²(1 df) = 12.89, P = .0001).

Uncertainties per case presentation. Overall, students expressed a total of 82 uncertainties during the 41 case presentations that contained one or more uncertainties. For those presentations, students using the SNAPPS technique expressed, on average, 2.3 (SD 1.2) uncertainties per case presentation compared with 1.7 (SD 0.9) for those in the comparison group (F(1,39) = 3.24, P = .79).

The nature of uncertainties. The qualitative analyses yielded a total of 16 distinct types of uncertainties. We later grouped these into three broad categories: diagnostic reasoning, clinical findings, and medications/management (see Table 1). A 2 × 3 chi-square analysis comparing the SNAPPS and comparison groups across the three uncertainty categories showed statistically significant differences overall (χ²(2 df) = 8.28, P = .02). We conducted orthogonal planned comparisons to explore these relationships further. Over half of the uncertainties that students expressed when using SNAPPS were about diagnostic reasoning (55%; 24/44) compared with fewer than a quarter of uncertainties (24%; 9/38) for students in the comparison group (χ²(1 df) = 8.08, P = .004). A second planned comparison looking at group differences within the clinical findings and medications/management categories revealed no significant differences (χ²(1 df) = 0.21, P = .64). See Figure 1.
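As an arithmetic check, the reported chi-square values for the frequency comparison above (12.89) and for the diagnostic reasoning planned comparison (8.08) can be recovered from the counts given in the text. The sketch below uses Python with SciPy rather than the SPSS package used for the original analyses, and it assumes Pearson chi-square tests on 2 × 2 tables without continuity correction; it is illustrative, not the authors’ analysis code.

```python
# Sketch only: reproduces two reported chi-square values from counts given in the text,
# assuming Pearson tests without Yates continuity correction (SciPy, not the SPSS used).
from scipy.stats import chi2_contingency

# Presentations containing at least one uncertainty: SNAPPS 19/19 vs. comparison 22/41.
freq_table = [[19, 0],    # SNAPPS: with uncertainties, without
              [22, 19]]   # comparison: with uncertainties, without
chi2, p, dof, _ = chi2_contingency(freq_table, correction=False)
print(f"frequency comparison: chi2({dof} df) = {chi2:.2f}")              # chi2 ~ 12.89

# Diagnostic reasoning uncertainties: SNAPPS 24/44 vs. comparison 9/38.
dx_table = [[24, 44 - 24],   # SNAPPS: diagnostic reasoning, other
            [9, 38 - 9]]     # comparison: diagnostic reasoning, other
chi2, p, dof, _ = chi2_contingency(dx_table, correction=False)
print(f"diagnostic reasoning: chi2({dof} df) = {chi2:.2f}, P = {p:.3f}")  # ~8.08, P ~ .004
```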

Preceptor responses

Presence or absence of a response. Overall, preceptors responded to 80% (66/82) of the uncertainties expressed by the students. Preceptors responded to 82% (36/44) of the uncertainties expressed by students using the SNAPPS technique compared with 79% (30/38) of the uncertainties expressed by students in the comparison group (χ²(1 df) = 0.11, P = .74).

Nature of the response. We analyzed preceptor responses to student uncertainties with the same codes and categories used for student uncertainties. All of the preceptors’ responses to student uncertainties (66/66) aligned with the type of uncertainty the students had expressed, regardless of the study group they were in; for example, when a student expressed an uncertainty about diagnostic reasoning, the preceptor gave a response about diagnostic reasoning.

A 2 × 3 chi-square analysis comparing preceptor responses to uncertainties expressed by SNAPPS and comparison students across the three uncertainty categories showed statistically significant differences overall (χ²(2 df) = 9.48, P = .009). We conducted orthogonal planned comparisons to explore these relationships further. Almost two-thirds of preceptor responses to uncertainties expressed by students using the SNAPPS technique were about diagnostic reasoning compared with fewer than a quarter of preceptor responses to uncertainties in the comparison group (SNAPPS: 23/36 [64%] versus comparison: 7/30 [23%]; χ²(1 df) = 9.48, P = .002). A second planned comparison looking at group differences in preceptor responses to uncertainties about clinical findings and medications/management revealed no significant differences (χ²(1 df) = 0.002, P = .968). See Figure 2.

Expanded responses. We detected no significant differences between SNAPPS and comparison groups in the number of expanded preceptor responses—that is, the number of responses that went beyond answering the uncertainty (χ²(1 df) = 0.31, P = .58). Preceptors in the SNAPPS group gave responses that went beyond the student’s uncertainty a third of the time (33.3%, 12/36), compared with two-fifths (40%, 12/30) for preceptors in the comparison group. Examples of preceptor responses to student uncertainties are provided in Table 2.

Preceptor uncertainties. Overall, preceptors expressed their own uncertainties five times, during three of the 60 case presentations. SNAPPS preceptors expressed uncertainties twice, and comparison preceptors expressed uncertainties three times. Two uncertainties were related to clinical findings and three to medications/management.

Discussion

The 2009 SNAPPS study found that students who used the six-step SNAPPS method for case presentations expressed questions and uncertainties more often than those in the comparison groups.2 The results of the current study extend our earlier findings, shedding light on the nature of the questions and uncertainties that students express during their case presentations and on the teaching they receive in return. Our current results show both that students can initiate teaching if they ask questions of their preceptors during their case presentations and that the content of the students’ questions can drive the content of the preceptors’ teaching.

As hypothesized, students using the SNAPPS technique expressed more uncertainties about diagnostic reasoning than did the comparison students, who focused their uncertainties mostly on clinical findings and medications/management. This difference in favor of diagnostic reasoning for the SNAPPS students reflects the emphasis SNAPPS places on the expression of diagnostic and clinical reasoning. Two of the six SNAPPS steps focus specifically on diagnostic reasoning, namely, Step 2 (Narrowing the differential diagnosis) and Step 3 (Analyzing the differential). This structured process for learners presenting their diagnostic reasoning provides the preceptor with a direct window into what the students do and do not understand. In the traditional comparison case presentations, which focus more on the transmission of clinical data and management plans,9 student uncertainties predominantly centered, as expected, on clinical findings and medications/management issues.

The results from the present study show that when students share uncertainties with preceptors, the preceptors engage in a reciprocal teaching interaction based on those uncertainties. When students expressed uncertainties during case presentations, preceptors most often responded (80% of the time), regardless of the study group. The student’s question triggers a teaching–learning moment. No matter how busy preceptors are, no matter how many tasks they are balancing, they most often stop and teach when a student expresses an uncertainty. Preceptors are, in general, ready to teach at “the drop of a question.”

Preceptors responded to student uncertainties with teaching focused on the student’s question. If a student asked a question about medications, the preceptor responded with teaching about medications. If a student asked a question about clinical reasoning, the preceptor responded with teaching about clinical reasoning. The nature of the student uncertainty drove the content of just-in-time teaching and feedback. This experiential learning, as Dewey3 indicated nearly 80 years ago, maximizes the chances for student learning and long-term retention.

Because a student’s question not only provided the trigger for creating the learning moment but also defined the content of the learning conversation, the SNAPPS students who expressed more uncertainties about clinical reasoning received significantly more teaching about clinical reasoning than did their peers. Given that diagnostic errors account for a significant portion of errors in patient care,10 teacher–learner conversations must include such diagnostic and clinical reasoning content. Generally, diagnostic errors in internal medicine are not the result of inadequate medical knowledge but rather of inappropriate cognitive processing or poor self-monitoring of the reasoning process.11

The student’s case presentation to the preceptor—and the educational exchange that presentation stimulates between them—not only has the potential to address learner uncertainties and difficulties, but also provides an opportunity for the teacher to give feedback to learners about their diagnostic reasoning. This feedback is important because, as Coderre et al12 have demonstrated, the quality of the reasoning strategies that medical students use influences their diagnostic success.

An in-context and just-in-time learning and teaching moment can be created by students using SNAPPS; the student asks a question about the current case, and the preceptor responds with teaching. Discussing, in situ, specific aspects of the case a student is currently working on effectively makes use of the power of double encoding—cognitive and experiential—in the student’s memory, which, in turn, enhances retention and recall later on.13

The SNAPPS technique is as feasible as it is theoretically sound. The number of uncertainties students expressed in their case presentations, about two per presentation, is ideal for brief, focused teaching and feedback. A few focused learning points can be inserted into the case presentation without significantly lengthening it.2

The results from this present study help to further define the content of the learning conversation—the cognitive dance between teachers and learners—that can take place during case presentations. In addition, the SNAPPS technique helps to align teacher and learner expectations of case presentation content14 by clearly stipulating that expressing uncertainties is an obligatory part of the case presentation.14 The SNAPPS case presentation technique not only provides a method-driven assurance that students will express an uncertainty, but also initiates a cycle of practice-with-feedback, which is, much of the time, focused on clinical reasoning.

The present study, while elucidating some of the key aspects that shape the teaching–learning moment, also opens up important areas for further research. Because the current study focused on the expression of student uncertainties and preceptor responses in an outpatient family medicine setting, findings may not generalize to other settings. Studies looking at students’ expression of uncertainties and preceptors’ responses—within the context of other specialties with their own unique teaching and learning challenges—would provide insights beyond the scope of the current work. In addition, the inpatient setting is ripe for studies of time-efficient teaching and learning techniques. The present study has focused on teaching in response to student questions and uncertainties, but has not explored learning and behavior change based on preceptor feedback. A study that uses a longitudinal design would add insights about long-term learning in response to feedback during case presentations. Also, examining both spontaneous preceptor teaching during case presentations and its alignment with students’ learning needs would be fruitful.

Continuing to explore steps and strategies that enhance the learning moment between the student and teacher should be a priority. Learner development in framing questions based on self-assessment of learning needs could optimize teaching during case presentations. For example, Egan’s15 series of three workshops with third-year medical students on elements of effective questioning skills may offer ways to strengthen students’ expression of uncertainties. Preceptors need to empower students, first, to think about the kind of teaching they need—diagnostic reasoning, clinical findings, medications/management—and, then, to frame questions so that their teachers can provide that teaching. Teachers listening to case presentations should also encourage the expression of questions that focus on the doctor–patient relationship, shared decision making, and tolerating uncertainty. And finally, educators need to look for the most effective ways to teach in response to student questions.

The SNAPPS technique is a simple tool, yet it serves as an effective cognitive forcing strategy that transforms students’ clinical reasoning and uncertainties from private thoughts to accessible, discussable teaching opportunities for teachers. To close, one student applying SNAPPS reflected, “When you have time to ask questions and get them answered, that’s when you learn the most, especially if it’s in the context of a patient because that’s when you remember the answers.” Her reflection illustrates two of the strengths of SNAPPS—that is, the best learning emanates from one’s own difficulties,3 and double (cognitive and experiential) encoding in memory improves retention and recall.13

Funding/Support: None.

Other disclosures: None.

Ethical approval: The University Hospitals of Cleveland institutional review board for human investigation and the University of Illinois at Chicago office for the protection of research subjects determined that the research protocol met the criteria for exemption.

Previous presentations: The authors presented a version of this report under the title “SNAPPS: Expression of student uncertainties can drive the quantity and types of teaching during case presentations to preceptors” as part of the Research in Medical Education Conference during the 2010 Association of American Medical Colleges Annual Meeting, November 2010, Washington, DC. In addition, the four authors presented this study at the 2010 Association for Medical Education in Europe Meeting in Glasgow, Scotland, August 2010.

References

1. Wolpaw TM, Wolpaw DR, Papp KK. SNAPPS: A learner-centered model for outpatient education. Acad Med. 2003;78:893–898.

2. Wolpaw T, Papp KK, Bordage G. Using SNAPPS to facilitate the expression of clinical reasoning and uncertainties: A randomized comparison group trial. Acad Med. 2009;84:517–524.

3. Dewey J. How We Think. New York, NY: Heath; 1933.

4. Cooke M, Irby D, O’Brien B. Educating Physicians: A Call for Reform of Medical School and Residency. San Francisco, Calif: Jossey-Bass; 2010.

5. Scott JN, Markert RJ, Dunn MM. Critical thinking: Change during medical school and relationship to performance in clinical clerkships. Med Educ. 1998;32:14–18.

6. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81.

7. Kilminster SM, Jolly BC. Effective supervision in clinical practice settings: A literature review. Med Educ. 2000;34:827–840.

8. Thomas DR. A general inductive approach for analyzing qualitative evaluation data. Am J Eval. 2006;27:237–246.

9. Molodysky E. Clinical teacher training—Maximising the ‘ad hoc’ teaching encounter. Aust Fam Physician. 2007;36:1044–1046.

10. Graber M, Gordon R, Franklin N. Reducing diagnostic errors in medicine: What’s the goal? Acad Med. 2002;77:981–992.

11. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med. 2005;165:1493–1499.

12. Coderre S, Mandin H, Harasym PH, Fick GH. Diagnostic reasoning strategies and diagnostic success. Med Educ. 2003;37:695–703.

13. Eva KW, Neville AJ, Norman GR. Exploring the etiology of content specificity: Factors influencing analogic transfer and problem solving. Acad Med. 1998;73(10 suppl):S1–S5.

14. Haber RJ, Lingard LA. Learning oral presentation skills: A rhetorical analysis with pedagogical and professional implications. J Gen Intern Med. 2001;16:308–314.

15. Egan ME. Soliciting Feedback by Asking Questions That Promote Thinking Among Medical Students: A Pilot Study [thesis]. Chicago, Ill: University of Illinois at Chicago; 2001.

Appendix

© 2012 by the Association of American Medical Colleges
