Trick question: What is the most-robust study design for assessing the efficacy of a medical or surgical treatment?
Give yourself half credit if you guessed the high-quality randomized clinical trial (RCT). Claim full credit if you picked the right answer: the meta-analysis of high-quality RCTs.
The reason, of course, is that a well-designed meta-analysis benefits from two qualities that even strong RCTs lack: generalizability and verification that the findings of the source trials were, at least in the main, replicable. What works at Harvard may not work in Hartford, Halifax, or Helsinki. (For purposes of this discussion, I’m defining a meta-analysis as a systematic review that pools data.)
And double credit to those of you who caught my cunning use of adjectives in the opener: robust, high-quality, well-designed. Treat yourself to a glass of four-dollar rosé.
But what happens if the meta-analysis is not so well designed, and how would you know? These are not trick questions, and knowing how to answer them can be the difference between well-intentioned clinicians helping patients and harming them.
Meta-analyses about the outcomes of medical or surgical treatments should include only randomized, controlled trials (or high-quality prospective studies) as source material. This generally is the norm at better journals [2, 12]. But this is not the case at all journals, which sometimes will allow authors to pool (that is, meta-analyze) data in systematic reviews about treatments; doing so can badly mislead readers and harm their patients. Caveat lector.
This is no mere methodological trifle. The main kinds of bias that accrue in retrospective and observational surgical research do not offset one another; they are additive. Selection bias (treatment groups with baseline differences in elements that may affect the outcomes in question), transfer bias (follow-up that is insufficiently long or complete to discern all relevant harms of treatment), and assessment bias (use of nonvalidated endpoints when evaluating outcomes) befoul retrospective studies more than randomized trials, and they all tend to make the novel treatment appear better than it actually is. If one pools a population of studies that tends to suffer from those limitations in a meta-analysis, the meta-analysis will tend to amplify those biases and overestimate the benefits of treatment. When the treatment in question carries serious risks—and in surgical journals, it almost always does—this problem can harm patients, or kill them.
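The arithmetic behind that amplification is easy to see. The editorial does not include a worked example, but a generic fixed-effect (inverse-variance) pooling sketch—with hypothetical numbers chosen purely for illustration—shows why pooling same-direction bias is so dangerous: the shared bias survives pooling untouched, while the pooled standard error shrinks, so the wrong answer comes out looking precise.

```python
import math

def pool_fixed_effect(estimates, std_errors):
    """Fixed-effect (inverse-variance) pooling, the basic arithmetic
    behind many meta-analyses: each study is weighted by 1/SE^2."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * est for w, est in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical numbers: five retrospective studies of a treatment whose
# true effect is 0.0, each overestimating it by about +0.3 because
# selection, transfer, and assessment bias all push the same direction.
estimates = [0.25, 0.35, 0.30, 0.28, 0.32]
std_errors = [0.15, 0.18, 0.16, 0.20, 0.17]

pooled, pooled_se = pool_fixed_effect(estimates, std_errors)
# The pooled estimate stays near +0.30 (the shared bias is preserved),
# while the pooled SE falls to roughly half that of any single study,
# so the biased result now looks statistically convincing.
print(round(pooled, 2), round(pooled_se, 2))
```

Pooling averages away random error, not systematic error; when every source study is biased in the same direction, the meta-analysis delivers that bias with a tighter confidence interval than any single study could.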
Meta-analyses also are cited in subsequent research more than are studies of any other design, a finding that holds true across all of health science research and has for many decades now. So, in terms of downstream harm, a poorly designed observational study is bad, but a sloppy meta-analysis is much worse.
Stated another way, while a good meta-analysis is our best tool, a weak meta-analysis amplifies bias and gives legs to bad data. So thoughtful readers need to go deep, or deeper, in honing their methodological toolkits to avoid being misled. Topics like heterogeneity and publication bias would be part of any “201 course” on this topic. These issues are worth learning about, and they sound more intimidating than they actually are. But at a minimum, readers should not waste their time reading meta-analyses about the results of surgical treatments that draw data from retrospective or observational research.
Systematic reviews are a little different from meta-analyses. Systematic reviews use reproducible approaches to search the available evidence, and they outline explicitly the parameters they use to decide which papers to include and exclude. Done right, systematic reviews are as scientific as laboratory research because they feature the same elements: clear, testable research questions; replicable methods; results that answer the questions within the parameters they set; thoughtful discussions that present take-home messages in the context of what was known before; and a limitations section on the review itself. But importantly, unlike meta-analyses, they don’t pool data, and so their conclusions should be more qualitative. It’s fair game for them to include well-done retrospective work, with the goal of providing a snapshot of what is known. But because the source material is not as strong as that used by meta-analyses, and because they don’t offer precise point estimates (and confidence intervals around them), they shouldn’t be accorded the same weight. And, of course, they should present their main findings with suitable caveats, and their conclusions generally should be modest.
It’s worth noting that meta-analyses can be done selectively and thoughtfully using study designs other than randomized trials as long as they are not evaluating the effectiveness of surgical treatments. For example, studies about novel diagnostic tests are almost never randomized, and they need not be. (In a well-designed study of a new diagnostic tool, all participants receive both the novel test and a gold standard for that diagnosis; there is no call for randomization.) Certainly, a population of such studies can be meta-analyzed to good effect [4, 6]. Occasionally, a meta-analysis about an operation will focus on complications or risk factors that cannot be randomized or on other unusual endpoints, and journals might decide that the information is worth putting out, though this should always be done with suitable caution. And once in a blue moon, meta-analyses are cobbled together from unusual sources—two that have come along lately here were drawn from registries and from the control groups of randomized trials—to tell other kinds of important stories that would otherwise go untold. But those are rare birds.
If you’re reading a meta-analysis about the results of a treatment and it includes retrospective source material, put it down. Write a letter to the editor explaining why. If this issue comes up frequently in the journal you’re reading, change the channel. Your patients will be better off, and your time will be more profitably spent in other ways.
1. Bhandari M, Busse J, Devereaux PJ, et al. Factors associated with citation rates in the orthopedic literature. Can J Surg. 2007;50:119-123.
2. Brand RA. Editorial: CORR® criteria for reporting meta-analyses. Clin Orthop Relat Res. 2012;470:3261-3262.
3. Dahl OE, Pripp AH. Does the risk of death within 48 hours of hip hemiarthroplasty differ between patients treated with cemented and cementless implants? A meta-analysis of large, national registries. Clin Orthop Relat Res.
4. Eriksson HK, Nordström J, Gabrysch K, Hailer NP, Lazarinis S. Does the alpha-defensin immunoassay or the lateral flow test have better diagnostic value for periprosthetic joint infection? A meta-analysis. Clin Orthop Relat Res. 2018;476:1065-1072.
5. Ikonen J, Lähdeoja T, Ardern CL, Buchbinder R, Reito A, Karjalainen T. Persistent tennis elbow symptoms have little prognostic value: a systematic review and meta-analysis. Clin Orthop Relat Res. Published online December 7, 2021. DOI: 10.1097/CORR.0000000000002058.
6. Kuiper JWP, Verberne SJ, Vos SJ, van Egmond PW. Does the alpha defensin ELISA test perform better than the alpha defensin lateral flow test for PJI diagnosis? A systematic review and meta-analysis of prospective studies. Clin Orthop Relat Res. 2020;478:1333-1344.
7. Leopold SS. Editorial: getting the most from what you read in orthopaedic journals. Clin Orthop Relat Res.
8. Oxman AD, Sackett DL, Guyatt GH. Users’ guides to the medical literature. I. How to get started. The Evidence-Based Medicine Working Group. JAMA.
9. Panesar SS, Bhandari M, Darzi A, Athanasiou T. Meta-analysis: a practical decision making tool for surgeons. Int J Surg. 2009;7:291-296.
10. Patsopoulos NA, Analatos AA, Ioannidis JP. Relative citation impact of various study designs in the health sciences. JAMA. 2005;293:2362-2366.
11. Shimozono Y, Seow D, Yasui Y, Fields K, Kennedy JG. Knee-to-talus donor-site morbidity following autologous osteochondral transplantation: a meta-analysis with best-case and worst-case analysis. Clin Orthop Relat Res. 2019;477:1915-1931.
12. Wright JG, Swiontkowski MF, Tolo VT. Meta-analyses and systematic reviews: new guidelines for JBJS. J Bone Joint Surg Am. 2012;94:1537.