An Evaluation of Provincial Infectious Disease Surveillance Reports in Ontario

Chan, Ellen, MSc; Barnes, Morgan E., MHSc; Sharif, Omar, MBA

Journal of Public Health Management and Practice: January/February 2018 - Volume 24 - Issue 1 - p 26–33
doi: 10.1097/PHH.0000000000000517
Research Reports: Research Full Report

Context: Public Health Ontario (PHO) publishes various infectious disease surveillance reports, but none have yet been formally evaluated.

Objective: PHO evaluated its monthly and annual infectious disease surveillance reports to assess public health stakeholders' current perception of the products and to develop recommendations for improving future products.

Design: An evaluation consisting of an online survey and a review of public Web sites of other jurisdictions with similar annual reports.

Setting: For the online survey, stakeholder organizations targeted were the 36 local public health units and the provincial health ministry in Ontario, Canada.

Participants: Survey participants included epidemiologists, managers, directors, and other public health practitioners from participating organizations.

Main Outcome Measures: Online survey respondents' awareness and access to the reports, their rated usefulness of reports and subsections, and suggestions for improving usefulness; timeliness of select annual reports from other jurisdictions based on the period from data described to report publication.

Results: Among 57 survey respondents, between 74% and 97% rated each report as useful; the most common use was for situational awareness. Respondents ranked timeliness as the most important attribute of surveillance reports, followed by data completeness. Among 6 annual reports reviewed, the median time to publication was 11.5 months compared with 23.2 months for PHO.

Conclusion: Recommendations based on this evaluation have already been applied to the monthly report (eg, focusing on the most useful sections) and have become key considerations when developing future annual reports and other surveillance reporting tools (eg, need to provide more timely reports). Other public health organizations may also use this evaluation to inform aspects of their surveillance report development and evaluation. The evaluation results have provided PHO with direction on how to improve its provincial infectious disease surveillance reporting moving forward, and formed a basis for future work in surveillance product development and evaluation.

Informatics Department (Mss Chan and Barnes) and Infection Prevention and Control Department (Mr Sharif), Public Health Ontario, Toronto, Ontario, Canada.

Correspondence: Ellen Chan, MSc, Informatics Department, Public Health Ontario, 480 University Ave, Ste 300, Toronto, Ontario M5G 1V2, Canada (ellen.chan@oahpp.ca).

The authors express their sincere appreciation to representatives from Ontario's public health units and from the Ontario Ministry of Health and Long-Term Care for providing feedback to inform this evaluation. They thank Dr Natasha Crowcroft and Tina Badiani for their advice and contributions in conceptualizing, implementing, and describing this evaluation. They also thank Marlon Drayton and colleagues from across the Communicable Diseases, Emergency Preparedness and Response Department at Public Health Ontario for their input in the development of the evaluation objectives and methodology.

This evaluation was supported by Public Health Ontario as part of its operational activities.

The authors have no conflicts of interest.

Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal's Web site (http://www.JPHMP.com).

This is an open-access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND), where it is permissible to download and share the work provided it is properly cited. The work cannot be changed in any way or used commercially without permission from the journal.

Evaluations in the area of public health surveillance have typically focused on examining various components of a surveillance system.1–4 Some studies have specifically evaluated the timeliness of surveillance reporting, although the focus has been on the period between disease onset and when cases are reported to public health.5 However, no surveillance evaluations in the literature have focused on the usefulness and timeliness of data analysis and dissemination of findings back to the public health organizations that supplied the data. Furthermore, no evaluations in the literature to date have specifically examined monthly and annual routine infectious disease surveillance reports. We believe that such evaluations can be helpful for improving existing surveillance reports and planning how and when to best disseminate surveillance information so that effective and timely public health action can be taken.

In the province of Ontario in Canada, public health comprises a network of organizations working together to promote health and prevent disease transmission. The Health Protection and Promotion Act establishes the roles and responsibilities of 36 public health units (PHUs), which are associated with urban and rural municipalities.6 The Ontario Public Health Standards and Protocols elaborate on roles and responsibilities, and outline the public health programs and services that PHUs deliver, which include reporting cases of infectious diseases to Ontario's Ministry of Health and Long-Term Care (the ministry) for surveillance purposes.7 While the ministry is responsible for provincial public health policy and programs, Public Health Ontario (PHO), an arms-length agency of the Ontario government, provides scientific and technical advice provincially in several areas, including infectious diseases and surveillance.8

As part of its mandate, PHO produces provincial routine infectious disease surveillance reports, with the intended primary audience being Ontario's PHUs and the ministry. Historically, the scope, content, and approach of 2 such reports had been determined internally: the annual Reportable Disease Trends in Ontario Report (the annual report) and the Monthly Infectious Diseases Surveillance Report (the monthly report; both reports available at http://www.publichealthontario.ca/en/DataAndAnalytics/Pages/DataReports.aspx). Although PHO has received anecdotal feedback about the reports, the organization had not engaged the reports' primary audience to seek their opinions of its routine infectious disease surveillance reports and whether the reports support their work.

The annual report is intended to summarize infectious disease activity in Ontario, focusing on the year assessed and comparing it with historical trends, and to describe the epidemiology of infectious diseases in Ontario for a public health audience, using various surveillance data sources. PHO produced 2 versions of the annual report for 2012: a full-length version in PDF format and an interactive online version. The interactive online version, first produced for 2012, allows readers to view summary information from the full-length report by navigating through an online module instead of scrolling through a traditional document. Both versions of the 2012 annual report were released in December 2014; the 2013 and 2014 annual reports were subsequently produced in the same 2 versions.

The monthly report contains 5 sections: an article highlighting the provincial epidemiology of 1 infectious disease or related topic each month (“Infectious Disease in Focus”); a section identifying reportable diseases that demonstrate significant increases in incidence, accompanied by an appendix with detailed counts by disease (“Significant Reportable Disease Activity” and “Appendix—Reportable Diseases,” respectively); a section reviewing infectious disease activity in other jurisdictions, with commentary on the potential impact on Ontario (“Infectious Disease Activity in Other Jurisdictions”); and a section summarizing concluded investigations for which enhanced surveillance directives were issued by the province (“Recently Discontinued Enhanced Surveillance Directives”).

To determine how PHO's current infectious disease surveillance reports can best meet the information needs of Ontario's PHUs and the ministry (the stakeholders), we conducted a formal evaluation of the 2012 annual report (full-length PDF and interactive online versions) and the monthly reports (November 2013 to December 2014 issues). The evaluation objectives were to assess these stakeholders' awareness of and access to the surveillance reports, determine whether and how they use the reports, solicit their feedback on how the reports could be modified to be more useful to them, and develop general recommendations on how future reports can best meet their needs. To our knowledge, this is the first published formal evaluation of routinely produced monthly and annual infectious disease surveillance reports.

Methods

Evaluation design and procedure

As the first evaluation of this type that PHO has conducted, our evaluation considered both processes (eg, the time it takes to produce the reports) and potential outcomes (eg, how the reports are used) pertinent to the reports. Therefore, the evaluation design encompassed formative and summative elements. The evaluation team comprised 4 PHO staff members, including an evaluation specialist. In taking a participatory approach to evaluation as per the principle of Utilization-Focused Evaluation,9 we identified relevant management and program staff who would be the intended users of the evaluation findings and engaged them over the course of the project. We finalized the evaluation design based on feedback from internal consultation, the resources available, and the time frame within which the final evaluation report needed to be completed.

On the basis of the evaluation objectives, the evaluation team developed an evaluation matrix (see Table, Supplemental Digital Content 1, available at: http://links.lww.com/JPHMP/A281, which outlines the evaluation questions along with the accompanying indicators, data collection methods, and the intended analysis). We selected 2 data collection methods: an online survey of stakeholders and a review of publicly available annual surveillance reports from other organizations. Our selection of these data collection methods was influenced by the large number of stakeholders, as well as availability of resources and time for data collection.

Measures of timeliness in annual surveillance reports from other jurisdictions

In February 2015, we reviewed the Web sites of 18 local, provincial/state, and federal public health organizations in Canada and internationally to identify annual surveillance reports that were comparable with PHO's annual report and to assess their timeliness, measured as time to data extraction and time to publication. We limited our review to traditional annual reports (full-length PDF version) because this format has been the most challenging product for PHO to release in a timely manner, and comparable products in other jurisdictions were available for review.

We qualitatively assessed the annual reports identified to determine whether they were comparable with PHO's annual report, based on the following criteria: focused on the period of 1 year, summarized counts and rates for reportable/notifiable diseases with supporting text, and had chapters or sections for each disease and/or disease group. We then reviewed the comparable reports to identify relevant dates or periods of time (the period of data described, earliest listed data extraction date, and the date of report release), including only reports with at least 2 of these 3 dates/periods, which were used to calculate timeliness indicators. We compared the median of these indicators with those similarly determined for PHO's annual report.
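
To make the timeliness indicators concrete: each indicator is simply the number of months elapsed from the end of the period of data described to either the earliest listed data extraction date or the report release date. The following Python sketch illustrates the calculation; the report names and dates are invented for illustration only (the evaluation itself performed these calculations in Microsoft Excel, as described under "Analysis and development of recommendations").

```python
from datetime import date

# Illustrative report metadata only; these are not the actual dates from
# the reviewed reports. Each record holds the end of the period of data
# described, the earliest listed data extraction date, and the release date.
reports = {
    "Report A": {"data_end": date(2013, 12, 31),
                 "extracted": date(2014, 6, 15),
                 "published": date(2014, 12, 10)},
    "Report B": {"data_end": date(2012, 12, 31),
                 "extracted": date(2013, 8, 1),
                 "published": date(2014, 2, 20)},
}

def months_between(start: date, end: date) -> float:
    """Approximate elapsed months as days divided by the mean month length."""
    return round((end - start).days / 30.44, 1)

for name, d in reports.items():
    print(name,
          "- time to extraction:", months_between(d["data_end"], d["extracted"]),
          "months; time to publication:", months_between(d["data_end"], d["published"]),
          "months")
```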

Survey participants and sampling

We administered an online survey of PHO's stakeholders through the survey software FluidSurveys (http://www.fluidsurveys.com), selecting this format to efficiently gain a broad understanding of how stakeholder organizations view the reports. To track survey completion, send reminders, and assess response rates, we compiled a list of individuals invited to complete the survey in advance. To build this list, we used existing PHO mailing lists to make initial e-mail contact with 105 communicable disease managers and epidemiologists from the 37 stakeholder organizations (36 PHUs and the ministry) on December 29, 2014. We provided these individuals with background on the survey and asked them to identify by e-mail, within 2 weeks, all colleagues in their organization with an interest in infectious disease surveillance who should receive a survey invitation, including themselves as appropriate. We used this snowball sampling approach to develop the most appropriate, up-to-date list of contacts. Contacts identified through this e-mail request made up 94% of the 157 stakeholders who ultimately received a survey invitation. For the 5 PHUs that did not respond to our request for contacts, we invited the initial e-mail contacts for those organizations to complete the survey (a total of 10 individuals).

Survey development and administration

Our development of the survey questions was guided by the evaluation questions and the accompanying indicators (see Table, Supplemental Digital Content 1, available at: http://links.lww.com/JPHMP/A281, for the evaluation matrix), as well as a simplified educational tool developed in-house to familiarize the evaluation team with the concept of evaluative thinking in the context of a knowledge-to-action framework.10,11 The tool outlines a series of interlinked questions related to awareness, access, relevance, utility, and quality of knowledge products/resources that can assist staff in critically assessing the efficacy of PHO efforts in linking knowledge to action on an ongoing basis.

Four PHO and 5 PHU staff members pilot tested the survey to ensure the questions were understandable and the instructions were clear. Seven PHO staff members also completed technical pilot testing of the online survey. On the basis of these test phases, we made minor modifications before administering the final survey, which consisted of 18 questions (9 multiple-choice questions; 2 each of checklist, Likert scale, and open-ended questions; and 1 each of ranking, binary, and multiple-choice/checklist questions).

The PHO Ethics Review Board reviewed and approved the survey questions and study protocols. To ensure participants' privacy, we adjusted the privacy settings for the online survey such that responses could not be associated with individual participants. An information and consent letter, describing the study background, objectives, data collection protocols, and how the survey data would be used and disseminated, preceded the survey questions. Informed consent was obtained from all survey participants.

The survey was open from January 14 to February 6, 2015. Following the initial e-mail survey invitation sent on January 14, we sent 3 e-mail reminders to those who had not yet completed the survey before the survey close date. After data collection ended, we downloaded the survey results for analysis and, once the evaluation was completed, removed the collected data from FluidSurveys.

Analysis and development of recommendations

We used Microsoft Excel 2010 to complete descriptive analyses of the collected data. Using the dates collected for comparable annual reports from other public health organizations, we calculated the time to extraction and time to publication in months for each report, and then the median and range across all reports from other organizations. For the survey data, we presented results from the close-ended questions as frequencies and summarized relevant points from the free-text responses.
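
For readers who prefer scripted analysis, the descriptive summaries described above reduce to a few lines of Python; this is a minimal sketch with invented placeholder values, not the evaluation's actual data or tooling.

```python
from statistics import median
from collections import Counter

# Placeholder time-to-publication values (months) for comparable reports
# from other organizations; invented for illustration.
time_to_publication = [8.0, 10.5, 11.0, 12.0, 14.5, 18.0]
print(f"median: {median(time_to_publication)} months")
print(f"range: {min(time_to_publication)} to {max(time_to_publication)} months")

# Close-ended survey responses summarized as frequencies, for example
# overall usefulness ratings (again, invented values).
ratings = ["very", "very", "moderately", "very", "slightly", "moderately"]
counts = Counter(ratings)
for level in ("very", "moderately", "slightly"):
    pct = 100 * counts[level] / len(ratings)
    print(f"{level} useful: {counts[level]} ({pct:.0f}%)")
```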

The evaluation team reviewed the results from the analysis of both data sources, in consideration of the relevant indicators and the corresponding evaluation questions, and used these results to inform our development of recommendations for future surveillance reports.

Results

We reviewed the Web sites of 18 public health organizations and found 9 with similar annual reports; we excluded 3 of these because the date information needed to calculate timeliness was unavailable. Among the 6 remaining reports, the median time to extraction was 5.9 months and the median time to publication was 11.5 months (Table 1); the corresponding times for PHO's 2012 annual report were 10.4 and 23.2 months, respectively.

TABLE 1

Of 157 survey invitations sent, a total of 57 participants completed the entire online survey, for a response rate of 36%; we excluded 11 partial responses. Respondents took an average of 17 minutes 30 seconds to complete the survey. The 57 participants represented 70% of the stakeholder organizations of interest (26/37), and their roles reflected those targeted to complete the survey (Table 2).

TABLE 2

Overall, the majority of respondents were aware of the monthly report (98%) and the annual report (full-length PDF version) (96%), whereas awareness of the annual report (interactive online version) was lower (88%). We observed a similar pattern for access, with 98% having accessed the monthly report, 91% the annual report (full-length PDF version), and 51% the annual report (interactive online version) prior to the survey. Ninety-five percent of respondents had accessed the monthly report in the past year, with 37% accessing it at least monthly. E-mail distribution was both the most common way that respondents became aware of the reports and their preferred way of being notified about future reports. The PHO Web site and e-mail distribution were the most common ways that respondents accessed the reports. Respondents preferred a PDF format for the monthly report, and there was continued interest in having both versions of the annual report.

Between 75% and 97% of respondents rated the monthly and annual reports as very, moderately, or slightly useful overall, with the monthly report most frequently rated as very useful (Figure 1). Usefulness ratings varied by monthly report section; the most useful sections were “Significant Reportable Disease Activity,” “Appendix—Reportable Diseases,” and “Infectious Disease in Focus,” rated as very useful by 75%, 68%, and 67% of respondents, respectively. Fewer respondents rated the “Infectious Disease Activity in Other Jurisdictions” and “Recently Discontinued Enhanced Surveillance Directives” sections as very useful (46% and 32%, respectively).

FIGURE 1

Respondents indicated that they used the reports for situational awareness (90% of respondents for both the monthly report and the annual report, full-length PDF version, and 64% for the annual report, interactive online version) and for sourcing content for local products (54%-83% of respondents, varying by report), followed in decreasing frequency by informing public health action, adapting templates for local products, and forwarding the reports to others (7%-44% of respondents, depending on the use type and the report). Many respondents also noted that the usefulness of the annual report was affected by the delay in publication following the end of the surveillance year of interest. If the reports were no longer available, 65% of respondents indicated that they did not believe they would be able to obtain the same infectious disease surveillance content elsewhere.

Respondents ranked timeliness as the most important surveillance report attribute, followed by data completeness (Figure 2). However, when we combined medium and high importance rankings, more respondents selected data completeness than timeliness (90% vs 74%).
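
This apparent reversal is a property of the two summary measures rather than a contradiction: an attribute can be ranked most important by the largest share of respondents yet receive medium or high importance from fewer of them. A toy illustration with invented counts for 100 respondents (not the survey's actual distributions):

```python
# Invented distributions of importance ratings for two attributes across
# 100 respondents, chosen only to show how the two summaries can disagree.
attributes = {
    "timeliness":        {"high": 50, "medium": 24, "low": 26},
    "data completeness": {"high": 40, "medium": 50, "low": 10},
}

for name, dist in attributes.items():
    combined = dist["high"] + dist["medium"]
    print(f"{name}: high only {dist['high']}%, medium-or-high {combined}%")

# Timeliness leads when only top ratings are counted (50% vs 40%) but
# trails once medium ratings are combined in (74% vs 90%).
```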

FIGURE 2

The majority of respondents indicated that the reports should continue to be produced at the same frequency (63% for the monthly report and 79% for the annual report), although several respondents recommended making the annual reports timelier to improve their usefulness. Specific free-text suggestions provided by respondents included adjustments to report content and functionality.

Discussion

Overall, the survey respondents were satisfied with many aspects of the annual and monthly reports. In general, respondents were aware of and were accessing the PDF version of the annual report and the monthly report, in addition to perceiving them as useful.

While a high proportion of respondents reported being aware of the online version of the annual report, fewer reported accessing this version; this finding could be attributed to the online version being launched for the first time only a few weeks before the survey was administered. Although the PDF version was released at the same time, the individuals surveyed may have already been familiar with the PDF versions of annual reports from previous years and understood how to use them in their work, unlike the new online version. These individuals may be more likely to access the online version in future releases, once they have had a chance to explore the tool and understand how they can use it.

The majority of respondents rated the monthly report as very useful overall, although usefulness ratings varied by monthly report section. On the basis of these findings, in January 2016, PHO streamlined the monthly reports going forward to focus on the 3 most highly rated sections and discontinued the 2 least useful sections.

The results also indicated that PHO's stakeholders perceived timeliness as the most important attribute of a surveillance report, followed by data completeness. Several respondents indicated that the length of time it took to produce the 2012 annual report (time to publication of almost 2 years) impacted its usefulness. The results of the timeliness analysis of reports produced by other public health organizations help PHO establish a target for future reports of similar scope: approximately 6 months for data extraction and 12 months for publication. These time frames would help ensure that reports not only are timely, and therefore useful, but also allow for confirmation of data completeness locally before extracting the data for provincial analysis.

An interesting finding was that almost two-thirds of respondents indicated that they would not be able to obtain the relevant infectious disease surveillance information they require if these reports were no longer available. This is notable because several other products and tools available to PHO's stakeholders summarize similar information, of which respondents seemed to be unaware. For example, Query is a PHO-developed tool, launched in July 2014, that allows registered users to dynamically explore reportable infectious disease data updated on a weekly basis. This tool provides more up-to-date data but lacks the descriptive content and interpretation included in the monthly and annual surveillance reports.

With this evaluation's findings in mind, PHO is currently seeking to develop a strategy for future improvements to its suite of surveillance products as a whole. Further analysis of the survey results by respondent role, along with further stakeholder engagement activities such as focus groups and evaluations, would be helpful in informing such strategic work. Questions to consider in developing the strategy may include, for example, whether an annual report with less interpretation would be different from Query and whether both products are needed. Ultimately, the resulting strategy should outline how PHO will provide and promote a complementary suite of infectious disease surveillance products moving forward that will best support stakeholders' needs in the most useful and efficient manner.

Although PHO has conducted previous evaluations, this was the organization's first formal evaluation of infectious disease surveillance reports. This was the primary limitation the evaluation team faced: no baseline data from prior evaluations, within or outside the organization, were available to assist in developing evaluation questions and indicators or to compare with this evaluation's results. However, we anticipate that future evaluations at PHO will build upon this initial experience.

Another challenge was that the preliminary evaluation results were expected by spring 2015, which meant we had to make strategic choices to ensure that timeline was met (eg, the survey was administered shortly after the distribution of the annual report). With more time, the survey responses specific to the annual report may have been more complete or different. The timeline also limited the opportunity for cross-referencing and validating data gathered from multiple sources using a mix of data collection methods.

Within the constraints of this evaluation, we did not gain a full understanding of all groups and individuals who access the surveillance reports; the survey only included audiences from PHUs and the ministry. Therefore, the survey results may not be generalizable to other stakeholders, such as other government ministries and agencies, as well as health care providers who may access and use these reports; it would be informative to include these groups in future evaluations. However, the results have allowed us to understand how the reports were being used by the surveyed groups and how their utility can specifically be enhanced for these audiences, as well as providing a basis for future work.

Implications for Policy & Practice

  • The evaluation findings have become the basis of recommendations to enhance provincial infectious disease surveillance reports in Ontario moving forward to better serve the intended audiences, some of which have already been implemented.
  • The results also highlighted the need for further stakeholder consultation to gain insight into what specific changes can be made, as well as the need to consider PHO's suite of surveillance products as a whole when planning future improvements.
  • Since the completion of this evaluation, several new evaluations have been initiated at PHO using this evaluation's approach as a guide.
  • Other public health organizations may also reference the methodology and findings of this evaluation when evaluating similar products or when consulting with stakeholders before and during product development, to better serve their respective stakeholders or intended users.
  • For example, a similar ranking approach could be used to determine the most important surveillance report attribute for their stakeholders (which for PHO was timeliness). If timeliness is found to also be the most important attribute, a practical time frame can be set for publishing reports that would be achievable in the organization's unique public health environment; because of differences in public health organizations and reporting networks, the specific time frames determined for PHO may not be applicable elsewhere.

Conclusion

In this evaluation, we sought feedback from the intended primary audiences for 2 of PHO's infectious disease surveillance reports. The evaluation provided valuable insights into awareness, access, and use of the annual and monthly reports from the primary audience, in addition to report production timelines to aspire to. Recommendations from the evaluation have been used to improve these reports and to inform the development of other PHO products, while the evaluation methodology and findings have formed a basis for future work in surveillance product development and evaluation.

References

1. World Health Organization. Communicable disease surveillance and response systems: guide to monitoring and evaluating. http://www.who.int/csr/resources/publications/surveillance/WHO_CDS_EPR_LYO_2006_2/en/. Published 2006. Accessed August 17, 2015.
2. Health Canada and the Public Health Agency of Canada, Office of Evaluation. Evaluation reports. http://www.phac-aspc.gc.ca/about_apropos/evaluation/evaluation-eng.php. Updated December 2, 2015. Accessed July 7, 2016.
3. Public Health Agency of Canada/Health Canada, Evaluation Directorate. Evaluation of the surveillance function at the Public Health Agency of Canada, 2013. http://www.phac-aspc.gc.ca/about_apropos/evaluation/reports-rapports/2012-2013/sf-fs/index-eng.php. Updated May 14, 2013. Accessed August 17, 2015.
4. Public Health Agency of Canada, Evaluation Services. Evaluation of food-borne enteric illness prevention, detection and response activities at the Public Health Agency. http://www.phac-aspc.gc.ca/about_apropos/evaluation/reports-rapports/2011-2012/feipdra-pdimeoa/index-eng.php. Updated September 14, 2012. Accessed August 17, 2015.
5. Jajosky RA, Groseclose SL. Evaluation of reporting timeliness of public health surveillance systems for infectious diseases. BMC Public Health. 2004;4:29.
6. Ontario. Health Protection and Promotion Act, R.S.O. 1990, c. H.7. http://www.ontario.ca/laws/statute/90h07. Updated August 31, 2015. Accessed August 22, 2016.
7. Ministry of Health and Long-Term Care. Ontario Public Health Standards. http://www.health.gov.on.ca/en/pro/programs/publichealth/oph_standards/default.aspx. Accessed August 22, 2016.
8. Public Health Ontario. Our organization. http://www.publichealthontario.ca/en/About/Pages/Organization.aspx. Accessed August 22, 2016.
9. Patton MQ. Essentials of Utilization-Focused Evaluation. Thousand Oaks, CA: Sage Publications; 2011.
10. Patton MQ. Evaluation Flash Cards: Embedding Evaluative Thinking in Organizational Culture. St Paul, MN: Otto Bremer Foundation. http://www.ottobremer.org/sites/default/files/fact-sheets/OBF_flashcards_201402.pdf. Published February 2014. Accessed September 25, 2015.
11. Graham ID, Logan J, Harrison MB, et al. Lost in knowledge translation: time for a map? J Contin Educ Health Prof. 2006;26(1):13–24.
12. BC Centre for Disease Control, Communicable Disease Prevention and Control Services (CDPACS). British Columbia annual summary of reportable diseases, 2013. http://www.bccdc.ca/NR/rdonlyres/D8C85F70-804C-48DB-8A64-6009C9FD49A3/0/2013CDAnnualReportFinal.pdf. Published September 3, 2014. Accessed October 29, 2015.
13. Florida Department of Health, Bureau of Epidemiology. Florida morbidity statistics report, 2012. http://www.floridahealth.gov/diseases-and-conditions/disease-reporting-and-management/disease-reporting-and-surveillance/data-and-publications/_documents/2012-fl-msr.pdf. Published December 2013. Accessed October 29, 2015.
14. Manitoba Health, Healthy Living and Seniors. Manitoba annual summary of communicable diseases, 2013: January 1, 2013 to December 31, 2013. http://www.gov.mb.ca/health/publichealth/surveillance/cds/docs/2013.pdf. Accessed October 29, 2015.
15. Nova Scotia Health and Wellness. Notifiable diseases in Nova Scotia: 2013 surveillance report. Population health assessment and surveillance. http://novascotia.ca/dhw/populationhealth/documents/Annual-Notifiable-Disease-Surveillance-Report-2013.pdf. Accessed October 29, 2015.
16. Toronto Public Health. Communicable Diseases in Toronto 2012. Toronto, ON, Canada: City of Toronto; 2012.
17. Adams DA, Jajosky RA, Ajani U, et al; Centers for Disease Control and Prevention. Summary of notifiable diseases—United States, 2012. MMWR Morb Mortal Wkly Rep. 2014;61(53):1–121. http://www.cdc.gov/mmwr/preview/mmwrhtml/mm6153a1.htm. Accessed October 29, 2015.
Keywords:

epidemiologic surveillance; evaluation studies; infectious disease; Ontario; reports

Copyright © 2018 Wolters Kluwer Health, Inc. All rights reserved.