Academic Medicine: October 2009 - Volume 84 - Issue 10
doi: 10.1097/ACM.0b013e3181b3707d
Professional Assessment and Development

A Small Grants Program Improves Medical Education Research Productivity

El-Sawi, Nehad I.; Sharp, Glynda F.; Gruppen, Larry D.

Section Editor(s): Dewey, Charlene MD; Tariq, Sara MD


Author Information

Correspondence: Nehad I. El-Sawi, PhD, President, KCUMB Institute for Medical Education Innovation, Kansas City University of Medicine and Biosciences, 1750 Independence Avenue, Kansas City, MO 64106-1453; e-mail: nelsawi@kcumb.edu.


Abstract

Background: This study compared research collaboration and productivity among applicants to a small educational research grants program.

Method: Brief interviews were conducted with 89% (8/9) of funded applicants and 55% (6/11) of unfunded applicants.

Results: Funded projects averaged 6.6 scholarly products and 3.8 interinstitutional collaborators per project, with 62.5% continuing their collaborations, compared with 2.8 products, 1.8 collaborators, and only 16% continuing collaborations in the unfunded group.

Conclusions: This program appears to benefit research productivity and multi-institutional collaboration.

The need for better medical education research has been widely noted while, at the same time, a variety of obstacles to this goal are readily apparent.1 A key constraint on improving the quality of medical education research is the lack of funding.2,3 Many educational research efforts suffer from a vicious cycle of dependence on local (institutional) funding to address local problems, with the resulting outcomes limited to the institutional context, of little generalizability, and methodologically weak. The perception that medical education research is of low quality decreases the willingness of potential extramural funders to invest scarce resources in what seem to be parochial and narrowly focused studies. This forces investigators to seek support from their institutions and starts the cycle of dependence over again. With the average cost of conducting a medical education research study ranging from $11,531 to $63,808,2,3 it is critical to support efforts to increase funding for this research.

A variety of specialty societies and individual institutions have sought to intervene on the funding issue by developing and offering small grants programs for educational research and development. The funds available are typically less than $10,000 per project, but because of the scarcity of funding for medical education research, these programs often receive many more applications than they can support. These small grants programs serve a policy purpose by emphasizing support for educational research and scholarship, and highlighting a commitment by the sponsor to improving medical education and supporting medical educators.4

However, there are few empirical data to address the question of whether these programs actually promote medical education research productivity. Reed and colleagues,5 in an excellent review, demonstrated that the quality of published studies is directly related to funding levels. One analysis of an institutionally supported small grants program documented a number of beneficial outcomes, including increased funding from extramural sources, peer-reviewed publications and presentations, and an improved environment for recognizing faculty contributions in education.6 Another report noted that a small grant resulted in more than $370,000 in external funding, but there was uncertainty about the benefits of the program to faculty career advancement.4

We were unable to find any prior studies that examined the efficacy of small grants programs by comparing outcomes between projects that were funded and those that were not. Such a comparison is a critical design feature for accurately evaluating program impact. The present study was therefore designed to evaluate the impact of one such program on research collaboration and productivity by comparing outcomes between applicants who had and had not received funding.


Method

The program

In 2001, the Association of American Medical Colleges’ Central Group on Educational Affairs (CGEA) initiated a small grants program to promote collaborative medical education research within the region. The Collaborative Grants Program (CGP) supports collaborative projects between CGEA sections (undergraduate medical education, graduate medical education, continuing medical education, and research in medical education), special interest groups (SIGs), and medical schools that are designed to advance the Central region and the objectives of its sections as a community of educational scholars. Consistent with the criteria for scholarship,7 the results of these projects must be public, available for peer review, and presented in a format that allows others to build on the work. The maximum funding level is $5,000, and grants generally run 18 months. Investigators can use the funds for administrative and technical support to carry out the project, supplies, communication between participants, and other appropriate research expenses.

Since its inception, the CGP has received 20 applications and has funded nine projects. The topics have been diverse, including comparing statistical methods for deriving MCAT and GPA thresholds for medical school admissions, evaluating evidence-based medicine skills with a Web-based objective structured examination, clinical skills evaluation, and comparing medical school prerequisites. Applications are accepted annually and reviewed by a committee of experienced medical education researchers for study quality, innovation, extent of collaboration reflected in the proposal, and feasibility.

Data collection and analysis

The list of prior CGP applicants and contact information was obtained from the CGEA with the approval of the CGEA Executive Committee. This project was reviewed by the University of Michigan institutional review board and was determined to be an exempt study according to 45 CFR 46.101.

Twenty applications were identified between 2001 and 2006. We attempted to contact the principal investigator of record on each application by e-mail to determine his or her willingness to participate in the study. When there was no response to e-mail, we telephoned the principal investigator.

Brief structured telephone or face-to-face interviews were conducted with the principal investigators of applications to the program. Questions were similar for funded and unfunded applicants and addressed the purposes of the research and of the application to the CGP, implementation of the proposed study, the success and outcomes of the project (if implemented), and the extent to which the project included interinstitutional collaboration. The survey also asked about the significance of the funding, with the options “crucial” (the work could not have occurred without it), “important” (it facilitated the work but was not crucial), and “helpful.” Another question addressed the products that resulted from the project, including presentations, manuscripts submitted for publication, published articles, curricula, and others. We took these reports at face value and did not attempt to validate them by locating citations. We also asked open-ended questions about the impact of CGP funding and any changes that took place as the project was implemented.

Of the 20 applicants to the CGP, we completed data collection for 89% (8/9) of funded applicants and 55% (6/11) of unfunded applicants, an overall response rate of 70%. We contrasted quantitative outcomes between the funded and unfunded cohorts using both parametric (t tests) and nonparametric (Mann-Whitney U) procedures, along with independent tests of proportions, as appropriate. Responses that were not quantifiable (e.g., “extended to all SIG members” for the number of collaborating institutions) were treated as missing data in the analyses. Reported results are statistically significant at P < .05 unless noted otherwise.
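For illustration, the group comparisons described above could be run as in the following minimal sketch. It assumes SciPy and statsmodels (the article does not name its statistical software), and the per-project collaborator counts are hypothetical placeholders rather than the study's raw data; only the continuing-collaboration counts (5 of 8 funded, 1 of 6 unfunded) are taken from the Results.

```python
# Minimal sketch of the analyses described above. SciPy/statsmodels are
# assumptions (the article does not name its software); the collaborator
# counts below are hypothetical placeholders, not the study's raw data.
from scipy import stats
from statsmodels.stats.proportion import proportions_ztest

funded = [5, 4, 4, 4, 3, 4, 3, 3]   # hypothetical counts, n = 8 funded projects
unfunded = [2, 2, 2, 1, 2, 2]       # hypothetical counts, n = 6 unfunded projects

# Parametric comparison of mean collaborators per project
t_stat, p_t = stats.ttest_ind(funded, unfunded)

# Nonparametric counterpart (Mann-Whitney U)
u_stat, p_u = stats.mannwhitneyu(funded, unfunded, alternative="two-sided")

# Independent test of proportions for continuing collaborations
# (5 of 8 funded projects vs. 1 of 6 unfunded projects, from the Results)
z_stat, p_z = proportions_ztest(count=[5, 1], nobs=[8, 6])

print(f"t = {t_stat:.2f} (P = {p_t:.3f})")
print(f"U = {u_stat:.1f} (P = {p_u:.3f})")
print(f"z = {z_stat:.2f} (P = {p_z:.3f})")
```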


Results

Table 1 summarizes the results of the completed interviews, segregated by funding status. The frequency and intensity of collaboration among institutions were greater in the funded group than in the unfunded group, with an average of 3.8 collaborators per funded project versus 1.8 per unfunded project (t = 4.69, P < .05, d = 3.1, large effect; U = 25, P < .05). Moreover, the longevity of these collaborations differed between groups: collaboration continued across multiple institutions within the region for 62.5% of the funded group compared with only 16% of the unfunded group (independent test of proportions, z = 2.00, P < .05, h = 0.98, large effect). For some of the funded group, collaboration did not continue because the project was completed and did not lend itself to continued activity.
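The reported effect size h can be reproduced with Cohen's arcsine formula for a difference between proportions (a standard definition; the formula itself is not given in the article), taking 62.5% as 5 of 8 funded respondents and 16% as, presumably, 1 of 6 unfunded respondents:

h = \left| 2\arcsin\sqrt{p_1} - 2\arcsin\sqrt{p_2} \right| = \left| 2\arcsin\sqrt{0.625} - 2\arcsin\sqrt{0.167} \right| \approx 1.82 - 0.84 = 0.98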


The average number of scholarly products (i.e., papers, posters, and presentations) was 6.6 (SD 4.5) per funded project and 2.8 (SD 2.1) per unfunded project. Although this difference represents a large effect size (d = 1.16), the small sample sizes did not provide enough power to reach statistical significance.
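As a rough arithmetic check, the reported d is consistent with these summary statistics if it was computed as the mean difference divided by the average of the two standard deviations (one common convention; the article does not state which was used):

d \approx \frac{6.6 - 2.8}{(4.5 + 2.1)/2} = \frac{3.8}{3.3} \approx 1.15,

which matches the reported 1.16 allowing for rounding of the means and SDs.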

Qualitative data from the interviews helped to elucidate some of these findings. One of the funded investigators commented that collaboration would have been easier to maintain if the resources had been greater. Indeed, several of the unfunded projects that were nonetheless conducted proceeded only by eliminating the collaborative facet of the study. Several unfunded respondents noted that they were able to implement their projects because they received internal, institutional funding after being rejected by the CGP. Indeed, one of these projects went on to secure more substantial funding from the U.S. Department of Education's Fund for the Improvement of Postsecondary Education (FIPSE). This suggests that the unfunded CGP proposals included high-quality projects that were competitive for other funding sources.


Discussion

It is clear that the funded investigators considered the funding from the CGP to be important to the success of their projects. However, this finding should be tempered by the fact that five of the six unfunded investigators implemented their projects despite not receiving CGP funding. Some of the funded respondents noted that the amount of funding mattered less to project success than did the commitment to collaboration among institutions. The funding was seen as legitimizing the projects and the collaboration.

Many of the unfunded investigators reinforced the idea that funding, particularly sufficient funding, is important for project implementation, especially given the amount of time and effort that goes into developing and carrying out a research project. For many of the unfunded groups that did implement a project, the intended scope was limited by lack of funds. Some indicated that collaboration across institutions did not occur because of the lack of funding. The investigator of the most successful unfunded project, which proceeded to implementation and was eventually awarded a FIPSE grant, commented that, had it initially been funded by the CGP, the pilot data would have come from a joint, collaborative study with a wider implementation scope. We believe that the value of the CGEA grant lies more in the institutional commitment, recognition, and prestige it confers than in its monetary value.

The greater productivity of the funded projects may also reflect the greater number of collaborating investigators rather than being a primary outcome in itself. Regardless of the dynamics of this increased productivity, these findings suggest that a small grants program focused on collaboration does augment scholarly productivity.

Limitations of this study include the small sample of projects. This is, in part, an unavoidable characteristic of the program, but despite the low level of statistical power, the differences in collaboration were statistically significant, and the difference in research productivity, although not significant, reflected a large effect of the program. The study also excludes more recent projects, which are unlikely to have been completed or to have produced many scholarly outcomes at this time.

The equivalence of the intervention and comparison groups is also questionable, given that allocation to each group is clearly not random. The selection process would imply that the unfunded applications are generally of lower quality than the funded projects. However, the reviewers of these proposals routinely acknowledge that the financial constraints require rejection of many high-quality proposals. Therefore, differential quality is a confounder, but not, perhaps, the key factor in these differences. That said, it is difficult to envision an alternative comparison group that does not have similar or even more severe confounding factors.


Conclusions

Despite these limitations, this study provides evidence that a small educational research grants program can have a beneficial effect on research productivity and promote collaboration among institutions. With the 2008 Josiah Macy, Jr. Foundation report on the mission of medical school education8 calling for an increase in the investment in Title VII to support innovations and research in health professions education, it is crucial that more programs to fund this research be initiated and funded at significant levels.

The fact that several of the investigators who were not funded went on to implement their projects may seem to indicate that such funding programs are not absolutely necessary. However, this finding may be less generalizable to programs in which the funding levels are higher, which would make conducting an unfunded study more demanding. The benefits of the CGEA small grants program suggest that these initiatives should be considered carefully by organizations and institutions interested in supporting and promoting medical education research.


Acknowledgments

The authors acknowledge the support of the Central Group on Educational Affairs Executive Committee for authorizing and supporting this study.


References

1 Gruppen LD. Improving medical education research. Teach Learn Med. 2007;19:331–335.

2 Reed DA, Kern DE, Levine RB, Wright SM. Costs and funding for published medical education research. JAMA. 2005;294:1052–1057.

3 Carline JD. Funding medical education research: Opportunities and issues. Acad Med. 2004;79:918–924.

4 Nieman LZ, Kelliher GJ. Stimulating medical education research through small grants. Acad Med. 1991;66:601–602.

5 Reed DA, Cook DA, Beckman TJ, Levine RB, Kern DE, Wright SM. Association between funding and quality of published medical education research. JAMA. 2007;298:1002–1009.

6 Albanese M, Horowitz S, Moss R, Farrell P. An institutionally funded program for educational research and development grants: It makes dollars and sense. Acad Med. 1998;73:756–761.

7 Glassick CE, Huber MT, Maeroff GI. Scholarship Assessed: Evaluation of the Professoriate. San Francisco, Calif: Jossey-Bass; 1997.

8 Cohen JJ. Chairman’s summary of the conference. In: Hager M, ed. Revisiting the Medical School Educational Mission at a Time of Expansion, 2008. Charleston, SC: Josiah Macy, Jr. Foundation; 2008.


© 2009 Association of American Medical Colleges
