Academic Medicine, April 2006, Volume 81, Issue 4
Institutional Issues

Managing Resources in a Better Way: A New Financial Management Approach for the University of Michigan Medical School

Elger, William R. CPA


Author Information

Mr. Elger is national chair of the AAMC Group on Business Affairs and executive director for administration and chief financial officer, University of Michigan Medical School, Ann Arbor, Michigan.

Correspondence should be addressed to Mr. Elger, Executive Director for Administration and CFO, University of Michigan Medical School, M4114 Medical Science Building I, 1301 Catherine Street, Ann Arbor, MI 48109-0624; telephone: (734) 763-5202; fax: (734) 763-4936; e-mail: 〈welger@umich.edu〉.


Abstract

Responding to changing trends in how the University of Michigan Medical School (UMMS) has traditionally been financed, and anticipating that these trends will continue, in 2002 the executive leadership at the UMMS embarked upon a course designed to change not only the school’s financial structure but its management culture as well. Changing traditional ways of thinking about budgets and developing a set of key performance indicators that demonstrate how certain activities shape the use of resources has brought greater understanding of how to make the best use of those resources. Through internally developed Web-based software applications called M-STAT, M-DASH, and M-ALERT (strategic reporting tools that the author describes), the UMMS now can manage resources in a completely different way. These tools are used to spot general financial trends or examine a more specific financial element (such as trends in grant funding or clinical activity), track the utilization of research space, calculate the break-even cost of research space, and, most important, model various “what-if” scenarios to help plan effectively for the future needs of the UMMS.

The strategic reporting system is still being integrated throughout the UMMS, so there has not yet been time to measure the system’s efficacy or its shortcomings. Nevertheless, important lessons have already been learned, which the author presents.

In this article, I describe how, in 2002, the executive leadership at the University of Michigan Medical School (UMMS) embarked on a course designed to change not only the school’s financial structure but its management culture as well. I explain the new financial management approach we have developed, which features three Web-based strategic reporting tools. These tools are already helping the UMMS to manage resources in a completely different way. The article ends with lessons learned so far and reports of early successes.


Introduction

For over 150 years, the UMMS has provided training to doctors and scientists in a wide variety of medical and basic science disciplines. Currently the UMMS has approximately 1,800 faculty, 1,000 housestaff, 680 medical students, 350 graduate students, and a yearly budget close to $1 billion.

The UMMS, like many others across the nation, has historically resembled a collection of silos, with each discipline principally concerned with its own vertical structure and showing little awareness of lateral cohorts or of the school as a whole. Funding for different units was confidential and seemingly discretionary, which led to significant fragmentation of management processes between divisions and to internecine rivalry that prevented the UMMS and its clinical and basic science departments from coalescing into an organization with a shared set of goals. Department chairs traditionally had little formal training in leadership and management skills, having been recruited based on their prominence within their disciplines and their interest in administration. Promotion to department chair meant sinking into a morass of financial paperwork that was hard to understand, did not provide timely information for making fundamental decisions affecting faculty and staff, and could not quickly and accurately give a picture of where the department was financially and where it was going. One aspect of the problem was identifying the data that each department needed in order to achieve the goals set for the UMMS and its departments. But until recently, there was no way to know where the school stood financially, which made planning at the department, school, and system levels nearly impossible. For example, budget meetings between the dean of the UMMS and individual department chairs were largely taken up in trying to reconcile each party’s respective set of financial figures instead of using the time to plan for short- and long-term objectives from the same set of financial data. In addition, since chairs could define departmental metrics differently from one department to the next, there was no way to compare metrics from one department to another. Without standardization of definitions across the board, it was difficult to make fully informed management decisions regarding space, personnel needs, and research directions.

These and other problems made it clear that changes were needed, and in 2002, with the dean’s support, I and several members of my senior staff formed a breakout group to create a new financial management approach. (Throughout this article, “we,” “us,” and “our(s)” refer to the breakout group.) This group includes, besides myself, the director, associate director, and assistant director of financial services, the assistant director of information services, the director of faculty affairs, and the director of grants review and analysis, all from the UMMS. After discussing fundamental changes in the financial structure and planning process that I envisioned for the school (these discussions took place over many weeks at the beginning of the process), the breakout group began meeting with department chairs to find out what key indicators each one used to guide decision making about the allocation of resources in his or her department. The indicators had to be quantifiable and able to be tracked over time using data gathered by the Medical School. These indicators, such as indirect cost recovery per square foot of laboratory space, eventually became part of a group of key performance indicators (KPIs) that would form the basis of one component of our new strategic reporting system.

One of the most important and difficult parts of planning at a medical school involves trying to figure out how much physical space is needed for clinicians and researchers, both immediately and one, two, five, 10, and 20 years down the road. An early step for the breakout group was to come up with common definitions of space so we could better understand how well space was being utilized and put all those involved in planning for their departments on a level playing field. All the physical space in the UMMS was categorized by use: laboratory space, office space, animal research laboratories, leased space, owned space, etc. Removing hallways, closets, and other noncritical-use space left net square feet, which was then characterized as either office space or laboratory space. Departments are charged rent by the UMMS at a fixed rate to cover operating costs (with lab space costing twice as much as office space). Knowing how much and what kind of space an individual department has, using common definitions, gave us the ability to compare direct and indirect dollars per square foot across departments.
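To make that comparison concrete, here is a minimal sketch of how common space definitions allow rent charges and indirect cost recovery per net square foot to be set side by side across departments. The rental rates, department names, square footages, and dollar amounts below are invented for illustration and are not actual UMMS figures; only the two-to-one lab-to-office rent ratio comes from the text above.

```python
# Illustrative only: hypothetical rates and departments, not UMMS data.
OFFICE_RATE = 20.0            # assumed $ per net square foot per year, office
LAB_RATE = 2 * OFFICE_RATE    # lab space is charged twice the office rate

departments = [
    # (name, office net sq ft, lab net sq ft, indirect cost recovery $)
    ("Dept A", 12_000, 30_000, 2_400_000),
    ("Dept B",  8_000, 45_000, 2_700_000),
]

for name, office_sqft, lab_sqft, indirect_dollars in departments:
    rent = office_sqft * OFFICE_RATE + lab_sqft * LAB_RATE
    idc_per_lab_sqft = indirect_dollars / lab_sqft
    print(f"{name}: rent charged ${rent:,.0f}; "
          f"indirect recovery ${idc_per_lab_sqft:,.0f} per net sq ft of lab")
```

With every department measured against the same definitions of net square feet, a comparison like this becomes meaningful rather than an argument over whose numbers are right.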

The strategic reporting system at the UMMS was specifically developed in 2002 to guide us in deciding, for example, how to respond to requests for incremental laboratory space, how much will be needed as new faculty are added, how the size of grants affects the amount of space allocated to researchers, and what kinds of grants faculty should consider pursuing compared with grants that their cohorts at other institutions might apply for. Before the strategic reporting system was developed, it was impossible to know with any degree of accuracy what grants the UMMS received out of the array of grants available, and what grants our researchers could apply for but routinely didn’t.

We needed to find out if we were using our financial, operational, and physical resources in the most efficient way possible to reach our targets. The overarching management goal for the UMMS is to reach its full potential as an organization, and for the departments to work together as one to become a leaner, more productive, more competitive organization. To do so, the financial management structure had to change.

The necessary changes included allocating funds to departments by a formula that reinforces growth, instead of funding based on historical precedents. A switch from checkbook accounting to accrual accounting provides a more accurate, up-to-date picture of what money has been spent and committed. Annual budgets have been abandoned, and in their place we use a five-year rolling forecast. We know that information, instead of being tightly held, has to be shared among departments if it is to be useful. We also need to manage with real-time data that are readily available instead of trying to make decisions based on yesterday’s financial information. To do so, the UMMS developed a strategic reporting system with three parts: M-STAT, M-DASH, and M-ALERT. These three tools are already helping us get exactly the data we need to inform our decision-making process and reach our goals.
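The practical difference between the two accounting views can be shown with a small sketch. The transactions, dates, and amounts below are invented; the point is only that an obligation incurred in June shows up in June under accrual accounting, but not until the payment clears under checkbook (cash) accounting.

```python
# Illustrative contrast between "checkbook" (cash) and accrual views of the
# same invented transactions; not a model of the UMMS accounting system.
from collections import defaultdict

transactions = [
    # (description, period incurred, period paid, amount $)
    ("Lab equipment order", "2005-06", "2005-08", 150_000),
    ("Research supplies",   "2005-06", "2005-06",  20_000),
]

cash_view, accrual_view = defaultdict(float), defaultdict(float)
for _desc, incurred, paid, amount in transactions:
    cash_view[paid] += amount        # expense appears only when the check clears
    accrual_view[incurred] += amount # expense appears when the commitment is made

print("June 2005 spending, cash view:   ", cash_view["2005-06"])     # 20000.0
print("June 2005 spending, accrual view:", accrual_view["2005-06"])  # 170000.0
```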

The data generated by our strategic reporting system are presented in a Web-based, user-friendly format and are accessible to everyone from department administrators on up. Transparency means everyone has access to the same data, is working from the same set of figures, and has a common understanding about research costs, available resources, and what financial and physical support will be needed to meet the goals the school has set for future growth.


Another Look at Budgeting

Traditionally, the dean of the UMMS held annual budget conferences with department chairs; together they reviewed what had happened the previous year, how much of the previous year’s budget was spent, and what new expenditures might be expected. They then attempted to arrive at a set of goals for the department. The “fixed performance contract” that emerged as a result of this meeting generally echoed the performance of the department in previous years (for better or worse), resulted in incremental budget increases that had little to do with long-term planning or matching expectations for growth with resource availability, and had a “rear-view-mirror” focus.

Since 2002, in place of a traditional budget meeting, the dean and department chairs now meet to discuss each department’s goals. Each department prepares a management discussion and analysis report, which discusses

* strategic initiatives,

* financial trends and issues,

* faculty recruitment, retention, promotion, and retirement,

* educational activities (medical students, house officers, graduate students, service teaching),

* clinical service,

* research activities and directions,

* space use and projected needs,

* progress on focus areas and incentive goals, and

* different sources of philanthropy.

After the meeting, the dean prepares a letter that outlines areas of focus and goals for both the department and the chair personally.

The UMMS also decided in 2002 to abandon traditional annual budgeting in favor of a five-year rolling budget, as discussed in the book Beyond Budgeting.1 The emphasis has shifted from making decisions based on data that take months to generate and do not reflect current revenue or spending (akin to navigating an oceangoing vessel by the stars) to an approach that looks forward to future needs and considers first and foremost the goals the organization is trying to meet (like navigating with a global positioning system that gives the up-to-date information needed to stay on course). In the new model, the focus is on managing through the use of key performance indicators.
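The sketch below illustrates the rolling-forecast idea in the abstract: the plan always covers the next five fiscal years, so each year the completed year rolls off and a new fifth year is added, re-projected from current actuals rather than from last year's budget. It is not the UMMS's forecasting model; the growth rate, dollar figures, and function name are assumptions made for the example.

```python
# Minimal sketch of a five-year rolling forecast (hypothetical figures).
def roll_forecast(forecast: dict, actual_base: float, growth: float = 0.04) -> dict:
    """Drop the oldest year and re-project five years from current actuals."""
    first_year = min(forecast) + 1
    return {first_year + i: actual_base * (1 + growth) ** (i + 1)
            for i in range(5)}

# Initial plan covering FY2006-FY2010, in $ millions (invented numbers).
plan = {2006 + i: 900.0 * 1.04 ** (i + 1) for i in range(5)}

# FY2006 closes at $915M actual; the plan now covers FY2007-FY2011.
plan = roll_forecast(plan, actual_base=915.0)
print(plan)
```

The design point is that each re-projection is anchored to what is actually happening now, instead of to an annual target negotiated a year or more earlier.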

Key performance indicators and value-added activities

Key performance indicators at the UMMS are organized into three groups:

* Those that represent major areas of focus, which are the UMMS’s financial and management performance and the educational ranking of UMMS students.

* Strategic indicators, which are of particular interest to the school or department, such as patient satisfaction, access to care, and net revenues per work relative value units (RVUs).

* Operational indicators, which reflect day-to-day operations management, such as payer mix trending or the number of work RVUs compared to the number of expected clinical full-time employees.

KPIs are used to monitor progress toward the goals of the UMMS in place of internally negotiated annual targets that did not contribute to cohesiveness and community in our organization. We now base evaluations and financial incentives for department chairs and administrators on relative improvement toward the goals we’ve set as an institution.
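To make a couple of the indicators above concrete, here is a rough illustration of how net revenue per work RVU and work RVUs per clinical full-time employee might be computed. The figures and the simplified definitions are invented for the example; they are not the UMMS's actual definitions or data.

```python
# Rough illustration of two operational/strategic KPIs with invented numbers.
def net_revenue_per_work_rvu(net_revenue: float, work_rvus: float) -> float:
    return net_revenue / work_rvus

def work_rvus_per_clinical_fte(work_rvus: float, clinical_ftes: float) -> float:
    return work_rvus / clinical_ftes

dept = {"net_revenue": 18_500_000, "work_rvus": 410_000, "clinical_ftes": 85}
print(round(net_revenue_per_work_rvu(dept["net_revenue"], dept["work_rvus"]), 2))
print(round(work_rvus_per_clinical_fte(dept["work_rvus"], dept["clinical_ftes"]), 1))
```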

With regard to performance, we’re trying to think in terms of whether an activity within a department of the UMMS adds value to operations, and thereby to the school as a whole. A rolling multiyear budget means decision makers can revise their plans based on real-time data and make timely adjustments without having to wait out an annual budget cycle.

Strategic reporting

As mentioned earlier, our strategic reporting system has three components: M-STAT, M-DASH, and M-ALERT (Chart 1). The UMMS developed these three tools to bring together in one virtual place a comprehensive database that provides real-time information for data-driven decision making. All the information contained in the database can be organized into highly customized reports that exclude extraneous material and are much easier to understand. A user can also create “what-if” scenarios that show how changing one or more facets of the data (an increase or decrease in R01 grants, for example) will affect relevant KPIs. It is also important to have data that make possible comparative analysis within and across departments, such as comparisons between departments regarding utilization of laboratory space. External benchmarks enable meaningful comparisons with other institutions when the relevant data are publicly available (National Institutes of Health grant award information, for example) and can be accessed and delivered to the user in an up-to-date, easy-to-understand format. We have made it possible for people to get the information they need to make sound business decisions with just a few clicks of the mouse. This allows us to use the most current information and to make solid projections about plans for the future.

Chart 1
M-STAT.

M-STAT is the cornerstone of our suite of strategic reporting tools, which the UMMS developed over a three-year period beginning in 2001. It is stored in an Oracle database and merges a comprehensive multiyear database with real-time information in a central location. Some information in M-STAT is updated daily, some weekly or monthly, depending upon data availability and need. Report categories are the faculty profile (a summary of compensation and of clinical, research, and educational activity), research expenditures and space, research submissions and awards, space utilization, staff workforce metrics, and teaching. A user selects from a pull-down menu of report categories, then chooses a subcategory such as “Research Submissions and Awards” to obtain current information on grant submission status. These selections can be expanded to include the entire UMMS, or narrowed down to a department, a division, or an individual faculty member. Reports are available from any one of these categories or from combinations of data across them, in PDF or Excel format, which makes them easy to read, understand, and print. Users choose search parameters from point-and-click menus to look at the data they are interested in, or they can generate customized reports from the raw data included in the database. Nearly all reports allow drill-downs into departmental and individual faculty detail. As new data become available, reports are updated automatically to keep them as close to real time as possible.

Data at a school-wide and department level are available to all departments. However, because sensitive information regarding individual faculty is contained in the database, access to reports created by M-STAT at an individual faculty level is restricted to the offices of the dean, department chairs, and division chiefs. The data reported on by M-STAT are highly detailed and provide the foundation for building M-DASH.

M-DASH.

I first proposed this part of the reporting system in October 2002; the breakout group had an initial version ready by March 2003 for use by executive management, department chairs, and administrators in the UMMS. It is a transparent tool that provides the same information to anyone using it. M-DASH gives an overview (as compared to the detail of M-STAT), using KPIs to assess the performance of the UMMS. The summary page on our demonstration Web site (described below) highlights a few of the major KPIs that the UMMS is tracking. There are five tabs across the top (Finance, Research, NIH Research, Clinical, and Education) where, through the use of drop-down menus, the user can go to a separate page for any of the KPIs being tracked for that area for a more detailed view.

One of the critical features of M-DASH has been its interactive nature: a user can toggle from a chart of numbers to a graph in order to see financial or activity trends that might not have been spotted before. M-DASH presents data in real time, can make comparisons among key indicators from the past, and allows a user to perform “what-if” scenarios. The following Internet link to our summary page gives the user access to a simulation of M-DASH: 〈https://www.umms.med.umich.edu/test.public.demo.dashboard〉. (Once the initial screen appears, enter the word “demo” for both the UMICH ID and password.)

The M-DASH screen and its point-and-click menu make it easy to use. At a glance, viewers can tell whether a particular activity is up or down for the current quarter compared to a year ago by directional arrows positioned beside each category. Each category has a menu that lists specific activities within the category. “Finance,” for example, contains two windows, one for profitability and one for liquidity. Either can be viewed at the school, department, or division level. Report options are fiscal year and accounting period; comparisons can be made by month, year to date, or year end. The user can look at the operating margin, total margin, and return on net assets for a unit, for a cohort unit, or at the school level. These data can be viewed as a table or a graph.
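For readers less familiar with the three profitability measures just mentioned, the sketch below applies their standard textbook definitions to invented figures; the UMMS's exact definitions and data may differ.

```python
# Standard definitions of the three profitability measures, invented figures.
def operating_margin(operating_income, operating_revenue):
    return operating_income / operating_revenue

def total_margin(total_income, total_revenue):
    return total_income / total_revenue

def return_on_net_assets(income, net_assets):
    return income / net_assets

unit = {"operating_revenue": 95.0, "operating_income": 3.8,   # $ millions
        "total_revenue": 102.0, "total_income": 5.1, "net_assets": 60.0}
print(f"Operating margin:     {operating_margin(unit['operating_income'], unit['operating_revenue']):.1%}")
print(f"Total margin:         {total_margin(unit['total_income'], unit['total_revenue']):.1%}")
print(f"Return on net assets: {return_on_net_assets(unit['total_income'], unit['net_assets']):.1%}")
```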

To understand the data better, the viewer can click on a bar titled “About This Page” and see call-out boxes for each category on the page that explain in further detail what each item means. Every window in M-DASH is accompanied by a page that explains how the data on that page were derived. Additionally, viewers can drill down to a more detailed level of the data from which the M-DASH overview was constructed for any category.

M-DASH is helpful in forecasting financial changes and identifying trends. For example, the online M-DASH tutorial shows a simulation page from the M-DASH category “NIH Research” titled “Modeling NIH Portfolio.” By pointing to any grant mechanism in the “Change UM ’05 Portfolio” window and sliding the button to the left or right, the user produces a corresponding change in the NIH Awards status box. The user can also see what funding levels to expect if there were an increase in a particular mechanism. A user can increase a grant mechanism by the number of grants received, by the size of individual grants, or by both. This screen can track grant activity by mechanism for individual departments or for the UMMS as a whole. It is one example of how department chairs can easily view grant activity in their departments and decide whether it is to their benefit to explore funding mechanisms they might not have focused on in the past.
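A toy version of the arithmetic behind such a what-if screen is sketched below. The mechanisms, award counts, and dollar amounts are invented, and the calculation is deliberately simplified compared with whatever the actual M-DASH model does.

```python
# Toy what-if calculation for a grant portfolio (invented figures).
portfolio = {          # mechanism: (number of awards, average annual award $)
    "R01": (220, 350_000),
    "P01": (12, 1_400_000),
    "K08": (30, 130_000),
}

def total_awards(p):
    return sum(count * size for count, size in p.values())

baseline = total_awards(portfolio)

# What if the school won 10% more R01s at the same average size?
what_if = dict(portfolio)
count, size = what_if["R01"]
what_if["R01"] = (round(count * 1.10), size)

print(f"Baseline projection: ${baseline:,.0f}")
print(f"With 10% more R01s:  ${total_awards(what_if):,.0f}")
```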

M-ALERT.

The third part of UMMS’s strategic reporting system is called M-ALERT. This application automatically sends an e-mail message to the user when a selected activity falls outside of a preestablished range. For example, a department chair may want to be notified when the percentage of faculty salary covered by grants decreases by a specified amount (5% or more, perhaps); the chair will then be given the exact percentage decrease and the time period within which it occurred. Drill-downs on the dashboard provide more detailed data that allow the user to determine what caused the change, which helps facilitate decisions on what action, if any, needs to be taken.
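A minimal sketch of the kind of threshold check that could sit behind such an alert follows. It is not the actual M-ALERT implementation; the indicator name, the interpretation of the threshold as percentage points, and the values are assumptions made for illustration.

```python
# Hypothetical threshold check; in the real system the message is e-mailed.
def check_alert(indicator, previous, current, threshold_drop=5.0):
    """Return an alert message if the indicator fell by >= threshold_drop points."""
    drop = previous - current
    if drop >= threshold_drop:
        return (f"ALERT: {indicator} fell {drop:.1f} percentage points "
                f"(from {previous:.1f}% to {current:.1f}%).")
    return None

msg = check_alert("Faculty salary covered by grants", previous=62.0, current=55.5)
if msg:
    print(msg)
```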

Although alert e-mails are currently configured for specified KPIs, eventually the user will be able to choose which parameters are important to him or her, and receive alerts about those. M-ALERT is the last part of the strategic reporting system to be developed and is still undergoing testing.


Looking Back, Looking Ahead

Our strategic reporting system is still being integrated throughout the UMMS, so at this time we don’t have outcomes that demonstrate either the system’s efficacy or its shortcomings. Nevertheless, we’ve already learned important lessons from our experience. First of all, we have become more cognizant of the degree to which using a “business world” vocabulary would turn some faculty off. Initially, many faculty members were wary of what metrics and benchmarking would ultimately mean, and hearing business terms bandied about the UMMS like a fait accompli was, and still is, irritating to them. Some have expressed resentment about what they see as the value of their life’s work boiled down to a metric. We needed a common language, one that came largely from the competitive marketplace, to describe the changes we were making so that we all understood where we were headed. Nevertheless, we’ve learned to be more sensitive, in this traditional academic setting, about how we introduce and use terminology from the corporate world at the UMMS.

Second, not all parts of the organization are comfortable using metrics and benchmarks. Not every faculty member agrees that analyzing the performance of a medical school based on quantitative measures is beneficial or even necessary. In the UMMS, with an entrenched academic medicine culture naturally resistant to change, restructuring the way the school is managed will take time. As one department chair has pointed out, it’s difficult to use metrics to assess the efficacy of good teaching skills, ethical behavior, and outstanding compassion toward patients, all crucial components of good medical and scientific practice. This conundrum has echoed throughout the institution as administrators, practitioners, and researchers grapple with what it means to work in an academic medical center with a business sensibility regarding cost effectiveness and best use of resources.

Third, it is not easy for an organization as complex as the UMMS to agree on a relatively short set of metrics. It was extraordinarily challenging to derive a set of metrics that represented the most important key indicators of a diversified medical school without excluding others that might have seemed more compelling from an individual department’s point of view. But metrics change, and ours have changed already, notably on the clinical side, where it was clear more metrics were needed to adequately capture the key financial elements of patient care.

We’ve seen some early successes as well. At budget meetings the dean and department chairs now come with the same sets of data, since they’re all getting them from the same place: M-STAT and M-DASH. And since the data are transparent, the suspicions and intrigue that were once part and parcel of budget decisions are largely gone. In another example, the UMMS has recently been hit hard by enormous utility cost increases that will have a significant impact on the cost of research. A tool in M-DASH can help decide how productivity will need to change in order to absorb these cost increases.

Our strategic reporting system, designed specifically for the UMMS, is still evolving as we rely on it more and more. Other medical schools have expressed strong interest in our new financial management approach; we hope that in the near future we will be able to compare our system with the financial systems of other medical schools that have also decided to use metrics and KPIs in the way we use them or in ways similar to ours.

We’re committed to the changes we have made in how we manage our financial resources and plan for the future. “You can’t change something unless you benchmark and understand trends,” says a department chair, who despite concerns about ensuring the quality of the data being relied upon for decision making, thoroughly endorses M-STAT and M-DASH and a more systematic approach to using resources. The chair added, “This is a great step forward. I really think that using such a system helps keep us honest and may help us shape the future.”


Reference

1 Hope J, Fraser R. Beyond Budgeting. Cambridge, MA: Harvard Business School Publishing Group, 2003.

© 2006 Association of American Medical Colleges
