A quality improvement approach to capacity building in low- and middle-income countries : AIDS


Supplement Articles

Bardfield, Joshuaa; Agins, Brucea; Akiyama, Matthewb; Basenero, Apolloc; Luphala, Patiencec; Kaindjee-Tjituka, Francinac; Natanael, Salomoc; Hamunime, Ndapewac

doi: 10.1097/QAD.0000000000000719



Building the capacity of the workforce to improve delivery of services is essential to scaling up and spreading efforts to combat the HIV epidemic in low- and middle-income countries (LMICs) and is increasingly recognized as a crucial component of strategic planning in President's Emergency Plan for AIDS Relief (PEPFAR)-supported countries [1]. Developing capacity and sustainability for country-led improvement programs requires engagement, as well as knowledge and skills building, across the public health system at the service delivery, organizational, and national levels. At the national level, leaders must set clear goals and priorities for work across organizational levels. This includes creating an environment in which improvement, learning, communication, teamwork, measurement, reliability, transparency, and safety are standard. Priorities need to be communicated throughout the healthcare system to organizational leaders in healthcare facilities and public health jurisdictions, who play vital roles as champions of improving care and as translators of national priorities to providers and to the community.

In this study, we focus on how capacity for quality management can be built and sustained at the national level using a framework from HEALTHQUAL International (HQI), which can then be harnessed to develop capacity throughout the public healthcare delivery system.

Application and adaptation of systems for managing quality of care using quality improvement methodology is one approach to capacity building for health systems strengthening in LMICs. A national quality management program (QMP) supporting implementation of modern improvement principles [2] requires a multitiered organizational structure and national leadership. To build capacity for quality improvement throughout the healthcare system and among healthcare facilities, both training and coaching are necessary – the former to build knowledge of basic concepts, and the latter to facilitate implementation as an adaptive process within the specific context of each healthcare organization.

The ultimate goal of HQI as supported through PEPFAR is to institutionalize improvement in the healthcare system. HQI is a capacity-building initiative that facilitates sustainable quality management programs in LMICs. The model for this program began in 1992 as the New York State HIV Quality of Care Program, sponsored by the New York State Department of Health (NYSDOH) AIDS Institute, and later expanded to HIV ambulatory care facilities throughout the United States with support from the Ryan White CARE Program through the Health Resources and Services Administration (HRSA) HIV/AIDS Bureau [3]. The model utilizes data for improvement, with the goal of enhanced patient outcomes and improved population health.

Practical and technical knowledge in specific public health and healthcare domains relevant to a government-led quality management program has been applied at the organizational level and supported by the workforce. The aim of this study is to demonstrate this capacity-building approach and to illustrate how the HQI framework can be adapted to local needs, resources, organizational, and institutional contexts. We use an example of HQI's work in capacity-building with the Namibia Ministry of Health and Social Services (MoHSS) to illustrate this approach.


The HQI framework consists of three main elements – performance measurement, quality improvement, and quality management (Fig. 1) – and is adaptable to local context to achieve sustainable capacity in national programs across the multiple organizational and workforce levels of the healthcare delivery system and across public health regions [4]. HQI's adaptive implementation – rooted in the Model for Improvement [5] – focuses on the use of improvement frameworks, principles, and methods, rather than a formal design, to meet the specific national context and local needs. Whenever possible, implementation is built into existing bureaucratic structures, performance measurement platforms, and quality improvement initiatives. Further, implementation is usually not targeted to a finite period; rather, sequential short-term aims are reframed on the basis of routinely measured data (performance measurement), with individual clinics and facilities selecting their own areas for improvement based on internal or external priorities.

Fig. 1: Three elements of the HEALTHQUAL framework.

HEALTHQUAL International partners with national Ministries of Health to develop facility-level systems for collection and analysis of clinical performance data, based on indicators of care derived from national guidelines in each country and from globally accepted standards of care. Implementing clinics are selected by the Ministry of Health (MOH) in each country on the basis of geographic location, clinic type, size, and level of need, to ensure an adequate mix of facility types (hospital, community health center) and geographic distribution (urban/rural). Clinical improvement teams then routinely collect performance data on the selected indicators, and develop and implement improvement activities informed by their performance measurement results as well as national and local improvement priorities. Clinics organize their measurement and improvement work into a quality management program consisting of structures, functions, and processes to support this work [3].

The HQI model is adapted in each implementing country, and is facilitated by HQI and other implementing partners through coaching, mentoring, and training of the national in-country team, which gradually assumes greater responsibility for this role. HQI imparts improvement knowledge and skills, on the basis of modern improvement principles [6,7], to support clinical improvement teams throughout the quality improvement project implementation process. HQI staff provide hands-on technical assistance for coaching and mentoring of local quality improvement coaches through in-person and virtual platforms, and during regional quality management group meetings of provider teams where peer exchange accelerates learning.

Case study

Beginning in 2007, with support through PEPFAR and technical assistance from the Centers for Disease Control and Prevention (CDC), the MoHSS and HQI piloted HIVQUAL-Namibia (HQ-N) to establish and support national quality improvement processes for HIV care and treatment. HQI provided technical support for the implementation of a national coaching and mentoring strategy to develop a dedicated team of individuals to execute a quality management program across the national healthcare system.

Implementation of HQ-N began with the engagement of senior leaders in the MoHSS to introduce the concepts of quality improvement and design a work plan for implementation of the program. Discussions were held about prioritization of indicators, data collection systems for measurement, and strategies for training the selected participating healthcare facilities in quality improvement and data collection methods. HQ-N developed specific implementation steps for each of the three components of the HQI model: performance measurement, quality improvement, and quality management. At larger healthcare facilities, a quality improvement committee (QIC), consisting of doctors, nurses, pharmacists, pharmacist assistants, data clerks, and community counselors, was formed. At smaller health centers, the dedicated HIV staff integrated quality improvement into their routine team discussions.

A data collection process was devised to facilitate performance measurement using the selected measures (Table 1). When feasible, the measures were adapted to the existing data collection platform – the electronic performance monitoring system (ePMS). Through this system, a list of eligible patients was generated, from which a sample was then drawn; sample sizes were calculated using standardized statistical tables to achieve a 95% confidence interval (CI) with a precision of ±8% in each implementing clinic, assuming an expected performance score of 50%. Once the sample was identified, data were manually abstracted from paper patient charts into the HQI software – a Microsoft Access-based program specifically designed with fields to capture indicator data.
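As a sketch of the sampling step described above, the helper below applies the standard sample-size formula for a proportion with a finite population correction (z = 1.96 for a 95% CI, expected score p = 0.5, margin of error e = 0.08). The function name, the example population sizes, and the use of this particular formula are assumptions for illustration; the program itself used standardized statistical tables.

```python
import math

def sample_size(population: int, z: float = 1.96, p: float = 0.5, e: float = 0.08) -> int:
    """Chart-review sample size for a proportion, with finite population correction.

    Hypothetical helper: z = 1.96 corresponds to a 95% CI, p = 0.5 is the most
    conservative expected score, and e = 0.08 is the +/-8% precision target.
    """
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)      # infinite-population estimate (~150 here)
    n = n0 / (1 + (n0 - 1) / population)        # finite population correction
    return math.ceil(n)

# Example: clinics with 600 and 10 000 eligible patients (hypothetical sizes)
print(sample_size(600), sample_size(10_000))
```

The correction matters most for small clinics: as the eligible population grows, the required sample approaches roughly 150 charts.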

Table 1:
HIVQUAL-Namibia indicators, definitions and improvement (2008–2013).

Once performance measurement data were generated, each facility team analyzed results and targeted specific areas for improvement. This data retrieval process occurred on a biannual basis, and quality improvement projects were developed and implemented on the basis of the results. In addition, after data verification and validation at the facility level, performance data were sent to the MoHSS, where they were further validated and aggregated into a master database to produce national benchmarking reports. National performance scores were reported as the mean of clinic-level scores for each data cycle and trended longitudinally; on the basis of these rates, national improvement priorities were identified.
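The national benchmarking step can be sketched as follows: clinic-level scores are averaged for each data cycle, so every facility counts equally regardless of patient volume, and the cycle means are then trended. The clinic names and scores below are invented for illustration.

```python
from statistics import mean

# Hypothetical clinic-level indicator scores (%) for two biannual data cycles
cycles = {
    "2008-H1": {"Clinic A": 62.0, "Clinic B": 48.0, "Clinic C": 55.0},
    "2008-H2": {"Clinic A": 70.0, "Clinic B": 57.0, "Clinic C": 61.0},
}

# National score per cycle = unweighted mean of clinic scores
national = {cycle: round(mean(scores.values()), 1)
            for cycle, scores in cycles.items()}
print(national)  # trended across cycles to identify national priorities
```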

Coaching and mentoring visits were conducted by national and regional program managers for planning, implementation, and evaluation of the quality of HIV services, with and without international technical assistance. These visits were intended to occur on a yearly basis; however, over time, they were performed only as needed. Quality improvement methods and strategies were adapted, with improvement-focused education, to provide hands-on opportunities for rapid uptake of quality improvement skills among clinic staff. At these coaching sessions, performance measurement skills, including data collection and reporting, were strengthened. Technical assistance focused not only on data capture and analysis, but also on data validity and visualization, to improve numeracy skills among staff and increase data use.

The MoHSS has demonstrated leadership and support for key peer learning strategies, which are critical to the spread of improvement skills and knowledge for wider implementation. Geographically clustered regional quality management groups and improvement workshops were convened 2–3 times annually by the national program as a mechanism to share improvement strategies and build clinic-level quality improvement skills nationally while improving local knowledge and spreading successful strategies through peer exchange. Workshops included plenary presentations, group discussions, sharing of performance data, drafting quality improvement work plans, and reviewing quality management plans. In this way, best practices were identified that could be adapted in different clinics and hospitals. National program leaders attended these meetings to demonstrate their support for the national quality management program and address policy issues as they emerged.

Facility-level organizational assessments [8] of quality management programs and processes are conducted annually by quality improvement coaches and are designed to evaluate organizational progress towards a sustainable quality management program. The organizational assessment is implemented in two ways – by an expert quality improvement coach or as a self-evaluation – and is scored from 0 to 5, with a scoring structure that evaluates program performance in specific domains along the spectrum of improvement implementation. The results are ideally used to develop a work plan for each element, with specific action steps and timelines guiding the planning process: to focus on priorities, set direction, and ensure that resources are allocated for the QMP. Results of the organizational assessment should be communicated to key internal stakeholders, leadership, and staff. Engagement of organizational leadership and staff is critical to ensure buy-in across departments, and essential for translating results into improvement practice.

Specific organizational assessment domains include the presence of a written quality management plan outlining organizational processes for setting improvement priorities and goals, planning and allocating resources for quality activities, and assigning timelines to achieve desired results [7]; the presence of an organizational infrastructure; and the extent to which the quality plan was being implemented and the level of capacity building achieved. Organizational assessments were scored from 0 to 5, and the data were analyzed for longitudinal improvement.

Baseline performance data were compared with results from the most recent follow-up period. Statistical significance testing of these comparisons was conducted using paired t tests, with a significance level of 0.05.
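This paired comparison can be sketched with a few lines of standard-library Python: compute the paired t statistic from baseline and follow-up clinic scores and compare it with the two-sided critical value for the relevant degrees of freedom. The scores below are invented, and the hardcoded critical value (df = 5, α = 0.05) stands in for the exact P value a statistics package would report.

```python
import math

# Hypothetical clinic-level indicator scores (%), baseline vs. most recent follow-up
baseline = [40.0, 55.0, 62.0, 48.0, 51.0, 45.0]
followup = [58.0, 63.0, 75.0, 60.0, 66.0, 52.0]

diffs = [f - b for b, f in zip(baseline, followup)]
n = len(diffs)
mean_d = sum(diffs) / n
var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance of differences
t = mean_d / math.sqrt(var_d / n)                        # paired t statistic

T_CRIT = 2.571  # two-sided critical value for alpha = 0.05, df = n - 1 = 5
print(f"t = {t:.2f}; significant at the 0.05 level: {abs(t) > T_CRIT}")
```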

The national quality management program is also assessed by a formal national organizational assessment (NOA), which measures the development and implementation of core domains of the program.

HEALTHQUAL International's work in Namibia was approved by the MoHSS, and received nonresearch determination from CDC as a method for routine program evaluation and quality monitoring.


HIVQUAL-Namibia began with 16 pilot clinics (wave 1), which first submitted data in 2008. An additional 15 clinics (wave 2) began in 2009, with data collection and quality improvement project implementation eventually expanding to a total of 38 HIV care and treatment facilities across all 34 health districts. Ten consecutive rounds of performance measurement data were submitted by healthcare facilities between July 2007 and March 2013, representing 37 465 individual patient charts sampled across review periods.

Between 2007 and 2013, the MoHSS, with support from CDC and HQI, conducted 47 quality improvement capacity-building workshops for teams from the participating facilities, attended by a cumulative total of 1094 healthcare providers and managers. National and regional program managers conducted 124 site visits to provide supervision, coaching, and mentoring to providers at the participating health facilities.

A core package of indicators, based in part on national antiretroviral therapy (ART) guidelines, was developed. Quality improvement indicators include routine clinic visits, CD4 monitoring, ART prescribing to eligible patients, adherence assessments, tuberculosis (TB) screening and assessment, isoniazid preventive therapy (IPT), Pneumocystis jirovecii pneumonia (PCP) prophylaxis, prevention education, weight monitoring, food security, and alcohol screening (Table 1).

Table 1 shows aggregate mean improvement in 10 out of 11 indicators between the first reported review period and the last measurement period [9]. Key areas showing significant improvement among HQ-N clinics reporting at baseline and most recent follow-up include prescribing ART to eligible patients (P < 0.00008), ART adherence assessment (P < 0.01), CD4 monitoring (P < 0.03), cotrimoxazole preventive therapy (P < 0.01), weight monitoring (P < 0.03), food security (P < 0.000002), and alcohol screening (P < 0.0002).

HIVQUAL-Namibia implementing clinics tested multiple quality improvement interventions chosen by the teams based on priorities determined through process investigation and root cause analysis to improve care in specific performance measures. Implementation of quality improvement projects has followed an asynchronous implementation approach that varies across clinics on the basis of workforce and resource availability. Collection of improvement project and intervention data is ongoing.

A specific example illustrates these strategies. The food security screening and referral indicator was prioritized with guidance and support from the MoHSS. Additionally, the national quality program introduced several initiatives and policy changes that affected all clinics while the quality measure was being introduced. These included introduction of a national Nutritional Assessment and Counseling Support (NACS) program through PEPFAR, improving documentation of patient information, institutional emphasis on referrals for food insecure patients, development and implementation of a food security screening tool, integrating specific fields for food security screening into the electronic record, patient nutrition education and counseling, and MoHSS-supported community-based food programs. The Ministry also encouraged clinic teams to focus on food security as a priority for their improvement initiatives.

All NOA domains demonstrated initial improvements in the national program (Table 2) [10]. Quality management domains that demonstrated sustained or continued improvement included the development of a comprehensive national HIV-specific quality management plan, selection of appropriate performance and outcome measures, and the outline of a timeline and accountabilities. Improvements in organizational infrastructure, having an HIV-specific quality management committee involving providers, consumers, and representatives from other MOH departments, and a process to evaluate, assess, and follow-up on HIV quality findings and ensure data are used to identify gaps, were all sustained.

Table 2:
Namibia national organizational quality assessment tool (scores by domain and year).

Domains assessing the development of a quality plan and capacity building, including appropriate performance data collection nationwide, the way in which quality improvement projects were conducted, and the availability of nationwide technical assistance all demonstrated ongoing improvement.


In this study, we show how quality improvement programs offer methods of health systems strengthening and capacity-building that operate synergistically at the organizational, regional, and national levels. While these three levels are often distinguished as separate entities, in reality, they are closely intertwined. HQ-N relies heavily on the direct involvement and support from the leadership in the MoHSS; however, the development of regional managerial skills and quality improvement skills at the healthcare delivery interface is also integral to the program's success. This is evidenced by clinics moving beyond basic quality improvement and implementing delivery system changes, such as integrating TB prophylaxis into routine HIV clinic sessions and focusing on food security as critical components of their national HIV care and treatment program.

We have used the case of Namibia to illustrate factors associated with national and local capacity-building within a centralized healthcare delivery system that is embedded in a public health program [11,12]. In Namibia, leadership was essential to successful implementation, specifically through championship of improvement principles and allocation of resources to implement this work. Institutional structures, including measurement systems, technical working groups, and regional quality management groups, enabled implementation, and succeeded particularly by engaging stakeholders in decision-making throughout the processes of indicator selection, site selection, and preparation through training, as well as coaching visits. Successful capacity building for quality management relies on the spread of improvement knowledge, which was supported by institutional structures for coaching, training, and peer learning.

Country ownership of local programs is central to the successful institutionalization of national quality management programs. HQI fosters country ownership in a variety of ways. HQI works within existing structures and processes to build knowledge and expertise in improvement methods, including data collection, analysis, and reporting, as well as process analysis and the use of data to improve systems and processes of care. Through targeted skills building in quality improvement, health services personnel learn to use data for improvement and to focus their work on achieving better health outcomes for patients. This participatory, data-driven approach engenders and sustains improvements in HIV care and treatment, with consideration for country context and local challenges. Development and fostering of a national coaching team ensures that improvement education and implementation of methods and principles are taught, modeled, and mentored over time by seasoned improvement coaches. As knowledge for improvement advances and expertise is institutionalized, skills are mentored and spread through clinic visits and regional learning networks that support sustainable implementation.

A second factor that promotes country ownership is that selected performance indicators are generated locally to reflect specific national interests and regional disease processes. Lastly, HQI's methodology is adaptive; therefore, it can be utilized by national Ministries of Health across the healthcare sector. For example, in Namibia, having developed a strong HIV quality improvement program using the HQI model, the MoHSS has integrated a quality improvement unit into their Quality Assurance Division (QAD) and implemented a multidisciplinary Technical Working Group that incorporates the HIVQUAL team into this sector-wide QAD group. Quarterly meetings are scheduled to discuss key issues regarding the quality of health services, and programmatic recommendations are made. Therefore, the HQI model continues to reinforce the HIV strategic plan while simultaneously permitting the evaluation of emerging issues.

Quality improvement is an important strategy for improving service delivery systems. In Namibia, implementation of modern improvement methods has helped facility-based HIV care and treatment teams identify overlooked areas of care that are key drivers of health outcomes. It has enabled focused attention on problem-solving and fostered a multimodal approach that combines nationally led public health initiatives with locally driven improvements to design interventions that improve both healthcare delivery systems and quality of life for patients. Advanced data systems and more rigorous implementation research will be required to formally link the gains made through improvement methods to the achievement of specific outcomes, including both viral load suppression and the reduction of mortality directly and indirectly attributable to HIV infection.

One of the main limitations of this study is the degree to which a correlation can be drawn between patient outcomes, based on aggregated facility scores, and organizational assessment findings as a reflection of national capacity. That improved health outcomes, as measured by clinical performance data, are a marker of capacity building can only be inferred. In fact, facility-level improvement interventions were not designed to be studied in isolation, but rather were implemented as part of an asynchronous approach based on the availability of local personnel and resources, and included simultaneous multimodal activities to achieve timely and rapid improvements in patient care and outcomes. Moreover, other technical assistance initiatives were implemented in parallel, such as clinical mentoring and data management technical support. Since no previous measurement is available, it cannot be determined whether these specific interventions resulted in improvement.

Moreover, the reasons for lower scores on indicators such as food security, alcohol screening, prevention education, and IPT remain to be elucidated. None of these activities had been routinely implemented before the introduction of the HIVQUAL measures, suggesting that the process likely drove the initiation of these activities in the clinical care setting, which nonetheless required time to adapt and integrate into routine HIV care. Additionally, the healthcare service delivery system is complex and dynamic and comprises many integrated components, including gaps in the organization of services, delivery systems, information systems, and community involvement that are amenable to improvement [12]. Other factors, such as the absence of key inputs like personnel or equipment, may also affect performance significantly, complicating interpretation of results.

One limitation of the statistical analysis is that the duration of performance measurement varied across clinical sites. An initial 16 HQ-N facilities initiated formal quality improvement activities in the first 6 months of 2008, whereas another 15 did so in the first 6 months of 2009. We considered all participating sites to have started at baseline because improvement activities did not begin before measurement did, even though clinic teams were spread widely throughout the country. In all instances, we could not control for other inputs or support that may have occurred at any time, although these inputs – particularly supplies and equipment purchasing – would have occurred throughout the country and affected all clinics at the same time. Aggregating the two groups increased the statistical power of our comparisons and yielded better evidence for determining whether activities were likely to have had a true effect on clinical performance.

A clearer link exists between improvements in the national organizational assessment and health system strengthening. However, a similar limitation is that program implementation data have not been captured in a way that allows us to correlate the number of coaching visits with improvement results and their cumulative effect on capacity building as a whole. In theory, review of organizational assessment results by the MoHSS, with mentoring by HQI, accelerated the sophistication of improvement work by identifying programmatic strengths and specific gaps, which led to targeted coaching and support and, in turn, fostered successful improvement interventions in healthcare facilities.

In conclusion, a quality improvement framework such as the one implemented by HEALTHQUAL permits engagement of all healthcare sectors, as well as country ownership of the national quality management program, offering a compelling strategy for sustainable capacity building of improvement skills and quality management in LMICs. The Namibia MoHSS has demonstrated leadership and support for key peer learning strategies that are critical to the spread of improvement skills and knowledge for wider implementation. The nation's organizational and healthcare delivery levels have shown that skill and experience in methods of improvement can continue to deepen over time. With plans underway for government-based financial support of positions within a new Quality Assurance Division reporting to the Deputy Permanent Secretary, the prospects for a sustainable national quality management program are promising. As quality improvement programs are scaled up in LMICs, metrics are needed to assess quantifiable gains in capacity building and to correlate these gains with measures of technical support.


Conflicts of interest

There are no conflicts of interest.


1. PEPFAR. Fiscal year 2013 country operational plan (COP) guidance [01-15-15]. http://www.pepfar.gov/documents/organization/217761.pdf.
2. Heiby J. The use of modern quality improvement approaches to strengthen African health systems: a 5-year agenda. Int J Qual Healthcare 2014; 26:117–123.
3. Agins BD, Young MT, Ellis WC, Burke GR, Rotunno FF. A statewide program to evaluate the quality of care provided to persons with HIV infection. Jt Comm J Qual Improv 1995; 21:439–456.
4. HIVQUAL Workbook. Guide for quality improvement in HIV care. New York State Department of Health AIDS Institute Health Resources and Services Administration HIV/AIDS Bureau; 2006.
5. Institute for Healthcare Improvement. How to improve [2-06-15]. http://www.ihi.org/resources/Pages/HowtoImprove/default.aspx.
6. Massoud R, Askov K, Reinke J, Franco L, Bornstein T, Knebel E, MacAulay C. 2001. A modern paradigm for improving healthcare quality. QA Monograph Series 1 (1). Bethesda, MD: Published for the United States Agency for International Development (USAID) by the Quality Assurance Project.
7. Langley GL, Moen R, Nolan KM, Nolan TW, Norman CL, Provost LP. The improvement guide: a practical approach to enhancing organizational performance. 2nd ed.San Francisco, CA: Jossey-Bass Publishers; 2009.
8. HEALTHQUAL. HEALTHQUAL organizational assessment [02-02-15]. http://www.healthqual.org/organizational-assessment-0.
9. HEALTHQUAL. HEALTHQUAL performance measurement report [02-02-15]. http://www.healthqual.org/healthqual-performance-measurement-report-2015.
10. HEALTHQUAL. HEALTHQUAL national organizational assessment [02-02-15]. http://www.healthqual.org/national-organizational-assessment-noa.
11. Leatherman S, Ferris TG, Berwick D, Omaswa F, Crisp N. The role of quality improvement in strengthening health systems in developing countries. Int J Qual Healthcare 2010; 22:237–243.
12. Thanprasertuk S, Supawitkul S, Lolekha R, Ningsanond P, Agins BD, McConnel M, et al. HIVQUAL-T: monitoring and improving HIV clinical care in Thailand 2002–08. Int J Qual Healthcare 2012; 24:338–347.

capacity building; quality improvement; total quality management

Copyright © 2015 Wolters Kluwer Health, Inc.