The elderly are especially vulnerable to domestic abuse. The challenges of aging, including loss of health and stamina, independence, and financial control, create risk factors for individuals over age 60. Elder abuse has many facets, including physical abuse, psychological abuse, financial/material abuse, violation of personal rights, and neglect. Older adults are most often abused by those closest to them, such as spouses and children. Between one and two million elderly adults experience abuse annually in the United States.1 However, 84% of cases are not reported to any adult protective service agency.2
Physicians are in an excellent position to notice behavioral and social changes suggestive of abuse. Such behavioral changes may include increased anxiety or depression as well as social withdrawal. Physical evidence of abuse, such as bruising or fractures, would likely be seen in a primary care setting before the patient is referred for specialty care. Given this circumstance, why do physicians account for only 2% of all elder abuse reporting agents?3 Legislators have enacted mandatory reporting laws in most states; however, most laws provide no funds for facilitating meaningful intervention.4 Cited reasons for not reporting cases include lack of knowledge, lack of confidence in the legal system, and concern about retaliation.5
Elder abuse continues to be problematic, and exposure to the problem of elder abuse should begin in medical school and continue throughout residency training.6 The Accreditation Council for Graduate Medical Education (ACGME) requires geriatrics training for primary care residencies, such as internal medicine, family practice, and general practice, but the ACGME does not quantify or describe in any specific measure the degree of exposure to elder abuse.7
To understand how elder abuse is being taught in residency programs, we surveyed ACGME-accredited programs in the state of Michigan to answer the following questions:
1. What is the current nature and extent of elder abuse education for physicians-in-training?
2. What are training directors’ attitudes toward the importance and relevance of elder abuse education for their trainees?
3. What is the current state of available elder abuse experiences or materials in residency programs?
Survey development focus group
In June 2006, we convened a focus group to assist with the process of creating a survey instrument and to help focus survey items. The group met for one day to review the literature on elder abuse education, review ACGME guidelines for geriatrics, identify key areas to be included in a survey, and develop survey items. Consensus was achieved through group discussion. The authors of this paper participated as observers in the group process. Focus group members represented medicine, law, social work, education, and statistics. We completed a thorough review of elder abuse literature and the education standards established for each specialty by the ACGME. The focus group reviewed these materials and established questions on elder abuse detection, barriers to education, and community resources.
We mailed our survey to 71 directors (or their designees) of residency programs in the state of Michigan. Targeted programs included all emergency medicine, family medicine, geriatric medicine/family practice, internal medicine, geriatric medicine/internal medicine, neurology, preventive medicine, physical medicine/rehabilitation, and transitional-year programs. To account for the situation in which an older adult might not have an internist or family doctor providing regular direct care, we included specialties that could be responsible for providing initial or continuing medical care for an older adult. The Michigan State University institutional review board approved this survey project.
Survey design and implementation
The mailing materials included a cover letter and survey instrument. The cover letter explained the purpose of the project, confirmed confidentiality for responses, and requested that the participant complete the survey and return it via mail. The 17-item survey instrument (Appendix 1) included a definition of elder abuse at the beginning of the survey. The definition stated, “For the purposes of this survey, elder abuse is defined as physical, sexual, emotional, verbal abuse and/or neglect, abandonment, financial/material exploitation, or self neglect of individuals age 65+.” Items were closed-ended or Likert-scale-based. Demographic items included the number of residents in the program, gender, program specialty, and location. Questions on curriculum examined approach to learning (didactic/clinical/computer), curriculum content, attendance requirements, content role in the overall residency curriculum, and awareness of elder abuse resources in the community. The survey instrument included additional questions on the need and type of related materials that program directors would desire. The final set of survey questions covered clinical contact and included items on the frequency and severity of elder abuse in the clinic population, elder abuse screening activities, and the director’s experience with Adult Protective Services (APS).
We confirmed mailing addresses and contact information using ACGME rosters. Before distributing the questionnaire, we contacted programs to confirm mailing information. We sent the initial mailing in August 2006, and we sent a second mailing to nonresponders in November 2006. A self-addressed, stamped envelope was included with both mailings to facilitate response.
We entered data into SPSS statistical software (SPSS Inc., Chicago, Illinois) for analysis. Residency programs that indicated at least one method of instruction (didactic, clinical, computer) (Appendix 1, Question 4) or at least one topic taught (Question 5) were coded as having a curriculum. We counted for each program the number of elder abuse topics covered, ranging from 0 to 11. We examined frequency distributions for all variables to identify and correct data entry errors and to examine the range of responses to each question. We performed chi-square analyses using the Fisher exact test to compare categorical variables in cross-tabulation tables. We conducted analyses of variance (ANOVAs) to compare mean values of continuous variables for different groups of programs. We used logistic regression to assess relations between binary variables and categorical and continuous predictors.
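Although our analyses were run in SPSS, the statistical procedures described above can be illustrated with a brief sketch. The sketch below uses Python with SciPy only to show the form of the tests; all counts and topic scores in it are invented for demonstration and are not the study data.

```python
# Illustrative sketch of two analyses described above (Fisher exact test on a
# cross-tabulation; one-way ANOVA on topic counts). Data are hypothetical.
from scipy.stats import fisher_exact, f_oneway

# Fisher exact test on a 2x2 table, e.g. program location (university vs.
# community) by hospital screening (yes vs. no). Counts are invented.
table = [[5, 1],    # university programs: screened / did not screen
         [6, 21]]   # community programs:  screened / did not screen
odds_ratio, p_fisher = fisher_exact(table)

# One-way ANOVA comparing mean number of elder abuse topics covered across
# three specialty groups; per-program topic counts below are invented.
family_practice = [9, 10, 8, 9, 11]
internal_medicine = [5, 4, 6, 3, 7]
other = [4, 5, 3, 6, 4]
f_stat, p_anova = f_oneway(family_practice, internal_medicine, other)

print(f"Fisher OR = {odds_ratio:.2f}, p = {p_fisher:.3f}")
print(f"ANOVA F = {f_stat:.2f}, p = {p_anova:.4f}")
```

The logistic regression step would follow the same pattern, regressing a binary outcome (e.g. good versus poor APS experience) on categorical and continuous predictors.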
Our survey response rate of usable responses was 58% (41/71). Demographic questions on the survey (Appendix 1) asked about program size, specialty, and location (Questions 1–3). The mean program size was 25.6 residents (SD = 20.9), with means of 14.7 males and 10.9 females. The breakdown of program specialties was 27% (11/41) family practice, followed by 24% (10/41) internal medicine, 15% (6/41) emergency medicine, 12% (5/41) transitional year, 10% (4/41) physical medicine and rehabilitation, 7% (3/41) internal medicine/geriatrics, and 2% (1/41) each for preventive medicine and other. The majority of programs were community based (65.9%, 27/41), followed by a mixed university/community setting (19.5%, 8/41) and a university setting (14.6%, 6/41). In terms of cases of elder abuse reported by program, 17% (6/35) reported 0 cases, 37% (13/35) reported between 1 and 4 cases, 23% (8/35) reported between 5 and 9 cases, and 23% (8/35) reported 10 or more cases. The mean number of cases reported was 6.7 (SD = 9.31).
Elder abuse curricula
A program was identified as having a curriculum if at least one elder abuse topic (Question 5) or at least one method of delivering elder abuse material (Question 4) was selected; 37 programs satisfied this criterion. Questions 4 through 8 address elder abuse curricula in the residency programs. More than 90% of the programs (37/41) had some type of elder abuse curriculum. More than half (29/41) of responding programs devoted between one and five lecture hours per year to the subject of elder abuse (Question 4). One-way ANOVAs indicated no significant difference in the rating of the seriousness of the elder abuse problem (Question 12) or in the number of cases of elder abuse seen each year (Question 14) based on the presence or absence of a curriculum.
Question 4 addresses the educational delivery of elder abuse topics; reported responses are restricted to the 37 programs with elder abuse curricula. For analysis, the categories of inpatient, outpatient, and nursing home were combined into one group labeled clinical. Of the 37 programs teaching at least one topic, 84% (31/37) used a didactic approach, 76% (28/37) used a clinical approach, and 11% (4/37) used a computer-based approach. When considering combinations of approaches, 11% (4/37) used all three (didactic, clinical, and computer), and 70% (26/37) used both didactic and clinical. The four programs that used a computer-based approach also used the other two delivery methods. Of the 31 programs offering lectures, only 77.4% (24/31) required attendance at the lectures.
We compared educational content delivery for different specialties and found that 100% (11/11) of the family practice programs used didactic and clinical deliveries compared with 75% (9/12) for the internal medicine programs. We found no significant association between specialty area and number of hours assigned to didactic or clinical curricula. We also found that neither program size nor program location significantly affected the delivery of curriculum.
Table 1 describes 11 elder abuse education topics (Appendix 1, Question 5) and the percentage of programs (out of 41) covering these topics. More than 70% of the programs covered physical presentation of abuse, and more than 70% covered social service support. The topic with the lowest coverage was epidemiology of elder abuse (22%). Screening instruments for abuse were covered in only 39% of the programs. The average number of topics covered per program was 6 (SD = 3.8); although 36 of the 41 responding programs reported one or more topics, the 5 programs reporting no topics were included as valid responses in computing this mean. Our survey asked only about attendance at elder abuse lectures and did not specify lecture size, small-group versus large-group format, or degree of resident participation in the lectures offered.
To compare curricula (topics in Question 5) across specialties in all 41 programs, we grouped the specialties in Question 2 as follows: family practice (n = 11), internal medicine (n = 13), and other (n = 17). By one-way ANOVA, family practice programs covered significantly more topics (mean = 9.18, SD = 1.25) than did internal medicine (mean = 4.85, SD = 3.85) or other programs (mean = 4.33, SD = 3.77), F = 7.64, P = .002. The percentage of programs in each specialty teaching at least one of the listed topics was 100% (11/11) for family practice, 84.6% (11/13) for internal medicine, and 82.4% (14/17) for other.
Respondents were asked to rate the quality of three characteristics (“practical,” “comprehensible,” and “retainable”) of their elder abuse lectures (Question 6) on a four-point Likert scale (1 = poor; 4 = excellent). Among the 31 programs with a didactic component, 87.1% (27/31) rated the practicality of their lectures as good or excellent, 87.1% (27/31) rated the comprehensibility as good or excellent, and 74.2% (23/31) rated the “retainability” as good or excellent. The average rating over the three characteristics (using the four-point scale) was 3.06, with a standard deviation of 0.55. Thus, for those programs with a didactic delivery, the average rating was good or better.
Screening practice and educational needs
Question 11 (Appendix 1) addresses educational needs for the residency programs. Ninety-three percent (38/41) of all programs wanted at least one type of additional material on elder abuse to help with their educational efforts. Table 2 describes the eight types of materials and the percentage of programs believing that each type would help supplement their current elder abuse curriculum. Of all requested materials, screening tools were requested most often, at 63.4% (26/41), whereas consultation with social workers (14.6%, 6/41) and ethical consultation/presentation (17.1%, 7/41) were requested least.
We combined types of educational materials into two groups: education (videos, online lectures or courses, educational material, patient information, or screening tools) and consultation (legal, social worker, or ethical consultation). With these classifications, 76% (31/41) of all programs wanted both educational and consultation assistance. Of note, only three programs did not request additional resources. Requesting educational or consultation materials was not significantly related to specialty.
Questions 13a and 13b address screening of patients in the hospital and clinical settings, respectively. Fifty-six percent (23/41) of programs screened in neither setting. Seventeen percent (7/41) screened only in the hospital, another 17% (7/41) screened only in the clinic, and about 10% (4/41) screened in both. Although no single “gold standard” elder abuse screening tool has found universal acceptance, 63% (26/41) of all programs cited the need for screening materials, far above the need for other materials.
To analyze screening in practice, we defined a new variable, “screen,” indicating screening in either the hospital or the clinical setting (general screening). By this definition, 44% (18/41) of programs screened in at least one setting. Having elder abuse screening as a curricular topic did not affect the percentage of programs screening for elder abuse in either setting. Among programs that covered screening as a topic, 44% (7/16) did not screen in practice; among programs that did not cover it, 65% (15/23) did not screen in practice. This may reflect the need for better screening tools, additional education on screening tools, greater awareness of the problem, or a lack of coordination between curricular topics and clinical practice.
Neither program size (Question 1) nor location of program (Question 3) was significantly related to general screening. However, university-based programs were more likely to screen in hospital (Fisher exact test P = .038) than community-based programs. We asked respondents to indicate the availability of community resources to help them treat the abused elder (Question 9). Inpatient medical services (71%, 29/41), psychiatric services (63%, 26/41), and outpatient therapy (63%, 26/41) were the most commonly endorsed modalities for treatment. Use of legal services was 44% (18/41), and use of crisis housing was 29% (12/41). Two surveys were missing responses to these questions. Using 41 as the denominator assumes that nonresponses were “no”.
Question 16 (Appendix 1) addresses the outcome experience in reporting abuse or dealing with APS. Sixty-one percent (25/41) of the programs rated the experience as “poor” (Question 16) because they experienced either no immediate action or no effective intervention by APS. Thirty percent (12/41) of programs found the reporting mechanism cumbersome and contact with APS workers difficult. At the low end, 2% (1/41) experienced inappropriate disclosure of the reporter’s identity, 7% (3/41) experienced ineffective assessment, and 12% (5/41) indicated that APS did not effectively intervene for patient safety in the situation.
The relationship between whether the role of elder abuse is major or minor in the program curriculum and the experience with APS was significant (χ2 = 4.118, P = .042). Of the 24 programs in which elder abuse played a minor role in resident education, 79% (19/24) reported a poor APS experience. Of the 13 programs in which elder abuse played a major role, 46.2% (6/13) reported a poor APS experience. The relationships between experience with APS items and program type, program location, or program specialty were not significant.
In examining the relationship between curriculum content and APS reporting, programs with good APS experience had a mean of 7.8 elder abuse topics covered (SD = 2.8), whereas programs with a poor APS experience had a mean of 5.7 elder abuse topics covered (SD = 3.6). Furthermore, logistic regression found that the odds of having a good APS experience increased by 2.38 times for each elder abuse topic covered in the residency curriculum. The type of delivery did not significantly affect the experience with APS.
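Because an odds ratio per topic compounds multiplicatively, the practical size of this effect may be easier to see with a worked example. The brief sketch below takes only the 2.38 odds ratio from the survey; the function and scenario are ours, added for illustration.

```python
# Interpreting the reported odds ratio: the odds of a good APS experience
# multiply by 2.38 for each additional elder abuse topic covered.
# Only the 2.38 comes from the survey; the rest is illustrative.
OR_PER_TOPIC = 2.38

def odds_multiplier(extra_topics: int) -> float:
    """Multiplicative change in the odds of a good APS experience
    for a program covering `extra_topics` more topics than another."""
    return OR_PER_TOPIC ** extra_topics

print(round(odds_multiplier(1), 2))  # 2.38
print(round(odds_multiplier(3), 2))  # 13.48
```

In other words, holding all else equal, a program covering three more topics than another would have roughly 13.5 times the odds of reporting a good APS experience.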
The results of this survey may not be surprising because they highlight the work that is still needed to improve elder abuse detection and appropriate treatment. Such work includes addressing inconsistencies in the amount and type of curricular attention elder abuse receives in residency programs, increasing the role of elder abuse contacts in resident clinical activity, and improving the awareness and quality of support services for the abused elder.
Residents did not seem to have adequate exposure to elder abuse cases, given that one third of all programs did not offer clinical contact time to learn about this problem. Family medicine programs seem more focused on the issue than other specialties, offering a broader spectrum of both clinical and didactic exposure. Additional work at the ACGME level on defining adequate curricular content might help spur other residencies to increase such curricula and exposure in their programs. The ACGME might consider developing specific topic guidelines that cover the needs expressed by our survey participants, including the epidemiology of abuse, physical signs of abuse, reporting mandates, and community-based resources. Information on screening tools would also be helpful to include in ACGME guidelines. Linking these lecture-based topics to required clinical exposure during residency would benefit trainees, and regular review of elder abuse educational content during residency site visits might motivate programs to incorporate elder abuse issues into all years of postgraduate training.

The ACGME might also consider requiring all residents, during each year of their training, to evaluate and care for elderly patients experiencing elder abuse. Setting this expectation and reviewing resident training logs to confirm that such clinical experience has taken place could improve the depth and breadth of residents’ exposure to elder abuse during training. In addition, each specialty could adopt practice guidelines to inform clinicians on strategies for detection as well as the process of effective intervention.
Although many medical schools are moving to Web-based curricula, it is interesting that few residency programs saw Web-based elder abuse training as valuable for their residents. This educational medium might be a good fit for busy clinicians who require flexible educational hours that fit into their clinical work. Web-based resources on elder abuse have different formats, including lecture, video, and interactive blogs. Perhaps additional emphasis on these resources, with reinforcement of concepts learned during clinical patient encounters, could help to meet the need for additional elder abuse educational resources so clearly voiced in this survey.
It would be easier to detect elder abuse if there were a universally adopted, valid, and reliable screening tool for use in the clinical setting. Clearly, many programs expressed the need for such a tool. Although researchers have studied a number of tools,8–10 many of these instruments are lengthy or require specialized training. Respondents in this survey clearly expressed the need for an easy-to-use, reliable elder abuse screening tool.
But what happens after a clinician suspects elder abuse? Most programs are aware of and teach reporting mandates for abuse. Of concern, however, are data from this study suggesting that nearly two thirds of programs feel APS does not effectively follow up on reported abuse. Clinicians may become frustrated and decline to report abuse because they believe little follow-up by APS will occur. This attitude, combined with previous APS reporting experiences, deserves further exploration. It may contribute to physician underreporting, and it highlights the need for additional liaison work between APS and the medical community to streamline referrals and improve the usefulness of the process.
There is more to be done in elder abuse education within residencies. Standardizing expectations across programs, providing meaningful didactics with adequate clinical exposure, and helping residents navigate barriers associated with reporting abuse are all warranted to improve the status of elder abuse education. Because our survey was limited to programs in the state of Michigan, the generalizability of our findings to other areas of the country is unknown. We also acknowledge that reporting bias may affect survey results, such as the number and quality of elder abuse lecture hours described by program directors. We recommend additional research to examine regional and specialty differences in elder abuse education and reporting. Future research should also focus on strategies to prompt physicians to assess for abuse by providing either more practical screening tools or practice guideline prompts for detection and management. Program directors should consider improving educational offerings to take advantage of technologic resources, such as computer-based learning and the Internet. As our elderly population grows, we are increasingly mandated to find more effective means to identify and address abuse among the oldest members of our population.
Funding for this survey project was provided in full by a grant from the Blue Cross Blue Shield Foundation of Michigan. The authors wish to thank the focus group members for this project, Drs. Frank Komara, Jed Magen, Lori Post, Dorrie Rosenblatt, Kay Taylor, and Sarah Slocum, for their help and consultation in developing the survey instrument.