Results availability for analgesic device, complex regional pain syndrome, and post-stroke pain trials: comparing the RReADS, RReACT, and RReMiT databases

Dufka, Faustine L.a; Munch, Troelsa,b; Dworkin, Robert H.c; Rowbotham, Michael C.a,*

doi: 10.1016/j.pain.0000000000000009
Research Paper

Evidence-based medicine rests on the assumption that treatment recommendations are robust, free from bias, and include results of all randomized clinical trials. The Repository of Registered Analgesic Clinical Trials search and analysis methodology was applied to create databases of complex regional pain syndrome (CRPS) and central post-stroke pain (CPSP) trials and adapted to create the Repository of Registered Analgesic Device Studies databases for trials of spinal cord stimulation (SCS), repetitive transcranial magnetic stimulation (rTMS), and transcranial direct current stimulation (tDCS). We identified 34 CRPS trials, 18 CPSP trials, 72 trials of SCS, and 92 trials of rTMS/tDCS. Irrespective of time since study completion, 45% of eligible CRPS and CPSP trials and 46% of eligible SCS and rTMS/tDCS trials had available results (peer-reviewed literature, results entered on registry, or gray literature); peer-reviewed publications could be found for 38% and 39%, respectively. Examining almost 1000 trials across a spectrum of painful disorders (fibromyalgia, diabetic painful neuropathy, post-herpetic neuralgia, migraine, CRPS, CPSP) and types of treatment, no single study characteristic consistently predicts unavailability of results. Results availability is higher 12 months after study completion but remains below 60% for peer-reviewed publications. Recommendations to increase results availability include supporting organizations advocating for transparency, enforcing existing results reporting regulations, enabling all primary registries to post results, stating trial registration numbers in all publication abstracts, and reducing barriers to publishing “negative” trials. For all diseases and treatment modalities, evidence-based medicine must rigorously adjust for the sheer magnitude of missing results in formulating treatment recommendations.

Fewer than half of registered complex regional pain syndrome, post-stroke pain, and analgesic device trials have available results. No single study characteristic predicts unavailability. Practical remedies are available.

aResearch Institute, California Pacific Medical Center, San Francisco, CA, USA

bDepartment of Anaesthesia, Centre of Head and Orthopaedics, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark

cDepartment of Anesthesiology, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA

Corresponding author. Address: California Pacific Medical Center Research Institute, 475 Brannan St, Suite 220, San Francisco, CA 94107, USA. Tel.: (415) 600-1750; fax: (415) 600-1725. E-mail address: rowbotm@cpmcri.org (M. C. Rowbotham).

Sponsorships or competing interests that may be relevant to content are disclosed at the end of this article.

Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal's Web site (www.painjournalonline.com).

Received July 12, 2014

Received in revised form September 01, 2014

Accepted October 16, 2014


1. Introduction

The value of evidence-based medicine rests on the assumption that treatment recommendations are robust, free from bias, and include results of all randomized clinical trials. However, publication bias and other types of reporting bias remain prevalent, and selective reporting of clinical trial results has been demonstrated to produce unrealistic estimates of drug effectiveness and the risk–benefit ratio.5,13,27,28,35,38,44,45,47,48,50,51,54,56,59 Patients with chronic pain, and their treatment providers, have the right to expect transparency in clinical trials research.

Clinical trial registration provides public access to basic trial information, planned outcome measures, and data analysis plans. Many registries enable links to results publications, but the links are often missing or incorrect, and few publications include trial registration numbers.4,15–17,30,53,55,60 Direct posting of results is possible on the largest registry, ClinicalTrials.gov (CTG); although such postings are not peer reviewed, they can be sufficiently standardized to facilitate meta-analyses. Depositing results on CTG is required for certain types of studies, but compliance is low.4,7,10,12,18,23–25,30,33,36,37,49,60

The Repository of Registered Analgesic Clinical Trials (RReACT) was developed to provide a global snapshot of registered clinical trials and a scorecard for public availability of results for post-herpetic neuralgia (PHN), diabetic peripheral neuropathy (DPN), and fibromyalgia (FM).12,30 The global RReACT methodology has also been applied to create the Repository of Registered Migraine Trials (RReMiT).4 Disorders covered in the RReACT and RReMiT databases are frequently studied in industry-sponsored clinical trials designed to test new therapies.

We hypothesized that disorders less commonly studied in new drug development efforts, and studies evaluating analgesic devices, might differ substantially in transparency. Two disorders associated with severe and refractory pain but attracting little industry drug development effort are complex regional pain syndrome (CRPS; types 1 and 2 are distinguished by the presence of a defined nerve injury) and central post-stroke pain (CPSP). Both were described more than a century ago, and both remain endlessly challenging from mechanistic and therapeutic viewpoints.

Issues of transparency and selective reporting are particularly important for invasive procedures and complex devices in the field of neuromodulation.42 Development of new medical devices for chronic pain is regulated very differently than new drug development.32,41 Spinal cord stimulation (SCS), a Food and Drug Administration (FDA)–approved class II device to relieve severe intractable pain,9 is usually a 2-step procedure consisting of an initial trial of percutaneous lead placement followed by permanent implantation if deemed successful. Spinal cord stimulation has been a treatment option for over 4 decades, and the technology is improving rapidly.46 Two noninvasive techniques of brain stimulation have emerged in the past decade: repetitive transcranial magnetic stimulation (rTMS) and transcranial direct current stimulation (tDCS).31,34 Repetitive transcranial magnetic stimulation is currently a class II device approved by the FDA for the treatment of major depression8 and is used off-label in studies evaluating its effectiveness for chronic pain. Transcranial direct current stimulation has not yet been approved. We therefore hypothesized that analgesic device studies might also differ substantially in transparency from drug trials.

To evaluate trial transparency in CRPS, CPSP, and analgesic device research and compare the results with previous studies, creation of new RReACT-type databases was necessary.


2. Methods

In this study, the RReACT methodology previously developed for PHN, DPN, and FM was applied to create new RReACT databases for CRPS and CPSP. A similar methodology was used to create 2 new databases, the Repository of Registered Analgesic Device Studies (RReADS) databases: one covers trials of SCS, and the other covers 2 types of external noninvasive transcranial stimulation, rTMS and tDCS. As part of the Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks (ACTTION) public–private partnership with the U.S. FDA, RReACT, RReMiT, and RReADS are freely accessible through the ACTTION Web site (http://www.acttion.org/).

Trials in RReACT are randomized and have a primary (or key secondary) outcome measure assessing drug efficacy. Trials of nutritional supplements and nontraditional medications are included. Trials in RReADS are prospective trials testing SCS, rTMS, and tDCS in patients with pain of various etiologies. Nonrandomized, observational, and/or single-group studies are included, but retrospective studies and trials including only healthy volunteers are excluded.

The RReACT and RReADS provide registry information on investigational drug/device name(s), drug route and mechanism of action or device stimulation site and parameters, secondary identifiers, study sponsor, study phase, start and completion dates, countries of enrollment, number of subjects, design summary, comparison groups, and primary/key secondary outcomes. Trial status is listed as actively recruiting, active but not recruiting, terminated, completed, unknown, or other. Trials listed as “active but not recruiting” included trials “not yet open for recruitment” and trials “active but no longer recruiting.” Trials listed as terminated included trials “withdrawn before subject enrollment” and trials “terminated after beginning enrollment.” Data collection took place in July 2013 for CRPS and CPSP and in February 2014 for SCS, rTMS, and tDCS.

The World Health Organization's International Clinical Trials Registry Platform (ICTRP) (http://apps.who.int/trialsearch/Default.aspx/) provides a single public-access search portal to 15 primary registries, including CTG. All 15 registries follow international standards for clinical trial registries, which largely coincide with International Committee of Medical Journal Editors requirements.57,58 As of August 2014, CTG, which accepts trials from any country, is the largest registry, with more than 173,000 trials; the EU Clinical Trials Register is the second largest, with more than 23,000 trials.

For this study, all 15 primary registries were searched through the ICTRP search portal, with the CTG registry also searched separately, for the 2 target disorders and the 2 types of devices. All trials found were examined manually. If the same trial was listed in 2 or more registries, it was considered multiply-registered and analyzed only once. Results were sought for all trials except those shown as actively recruiting, withdrawn before subject enrollment, or not yet open for recruitment.

A comprehensive search algorithm was followed. If links or citations of journal publications were provided on the registry record, they were manually checked to confirm correct pairing with the registered trial. If none was available, a manual search of PubMed was conducted using the trial name, drug name, key words from the study title, registry identifiers, principal investigator name, and other trial information. The gray literature was searched using Google, Google Scholar, and sponsor-related Web sites. To ensure accurate registry–results pairings, we relied on all available trial information, including registry data on comparison groups, sample size, principal investigator, and study dates.

Available trial-specific efficacy endpoint results are categorized as peer-reviewed journal article, data entered on registry, or gray literature. Results from the highest-level source are summarized, with peer-reviewed articles ranking highest and gray literature lowest. Only journals available through PubMed were considered peer reviewed. PubMed comprises more than 24 million citations for biomedical literature from MEDLINE, life science journals, and online books; Google Scholar and Google searches may pick up journals that are not indexed on MEDLINE. Each journal's article review policy was not separately assessed to confirm that peer review takes place before publication. Most journals indexed for PubMed are peer reviewed or refereed, but peer review criteria and reviewer or referee qualifications vary; the U.S. National Library of Medicine does not maintain a list of peer-reviewed or refereed journals in PubMed, nor can PubMed searches be limited to peer-reviewed journals. Separate searches of databases such as Scopus, which does not index exactly the same journals as PubMed, were not conducted and could have turned up additional publications (including non–peer-reviewed and non–English language articles) from journals not indexed on PubMed.
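The results-source hierarchy described above (peer-reviewed journal article ranks highest, gray literature lowest) can be sketched as a simple lookup. This is an illustrative sketch only; the record structure, field names, and function name are hypothetical and not part of the actual RReACT/RReADS databases.

```python
# Minimal sketch of the results-source hierarchy described in the Methods:
# peer-reviewed journal article > results entered on registry > gray
# literature. Trial records and field names here are hypothetical.
RANK = {"peer_reviewed": 3, "registry_posting": 2, "gray_literature": 1}

def best_source(sources):
    """Return the highest-ranked results source for a trial, or None."""
    found = [s for s in sources if s in RANK]
    return max(found, key=RANK.get) if found else None

# Hypothetical trial record with two results sources available.
trial = {"id": "NCT00000000", "sources": ["gray_literature", "registry_posting"]}
print(best_source(trial["sources"]))  # → registry_posting
```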


3. Results

3.1. RReACT-CRPS and RReACT-CPSP

As of July 2013, there were 34 trials for CRPS (Table 1) and 18 trials for CPSP (Table 2) meeting criteria for inclusion. The RReACT-CRPS database and the RReACT-CPSP database are supplemental files for this article (see appendices). A total of 31 trials were registered on CTG (16 CRPS and 15 CPSP). The other 21 trials (18 CRPS and 3 CPSP) were listed exclusively on 1 or more of ICTRP's other registries. Fourteen trials were multiply-registered; 10 for CRPS and 4 for CPSP. Figure 1 shows the number of registered trials initiated each year for the 2 disorders. Years 2005 and 2006 brought the greatest number of new trials for CRPS, and 2012 brought the greatest number of new trials for CPSP.

Table 1

Table 2

Figure 1

For CRPS, 3 of the 34 total trials were actively recruiting participants. Results were sought for the remaining 31 trials (17 trials listed as completed, 5 trials terminated after beginning enrollment, 7 trials listed as active but not recruiting, and 2 trials of unknown status). For CPSP, 7 of the 18 total trials were actively recruiting participants. Results were sought for the remaining 11 trials (9 trials listed as completed, 1 trial listed as active but not recruiting, and 1 trial of unknown status).

Thirty-five percent of CRPS trials (11/31) and 73% of CPSP trials (8/11) had available results. Twenty-nine percent of CRPS trials (9/31) and 64% of CPSP trials (7/11) had results in a peer-reviewed journal. Forty-four percent (7/16) of publications for CRPS and CPSP were linked directly to the registry; the remaining 56% were found by manually searching PubMed. For CRPS, 1 trial had results available through direct posting on CTG and 1 had results available only in the gray literature. For CPSP, 1 trial had results available through direct posting on CTG. Trials of CPSP were significantly more likely than CRPS trials to have results of any kind available (Fisher's exact test; P = 0.043), but not results in the peer-reviewed literature (Fisher's exact test; P = 0.070).
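The two Fisher's exact tests above can be reproduced from the counts reported in the text. The sketch below assumes 2 × 2 tables of [with results, without results] for CPSP versus CRPS; the variable names are illustrative.

```python
# Reproduce the paper's two-sided Fisher's exact tests on results
# availability, using the counts reported above.
from scipy.stats import fisher_exact

# Any results available: CPSP 8/11 vs CRPS 11/31
_, p_any = fisher_exact([[8, 3], [11, 20]])

# Peer-reviewed results: CPSP 7/11 vs CRPS 9/31
_, p_peer = fisher_exact([[7, 4], [9, 22]])

print(f"{p_any:.3f}")   # → 0.043
print(f"{p_peer:.3f}")  # → 0.070
```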

Focusing on those CRPS trials where the registry entry showed the trial as “completed” and with a specific completion date, results could be found for 10 of the 15 trials (67%) completed at least 12 months before our data collection, of which 9 (60%) were in peer-reviewed journals. The 1 CRPS trial completed <12 months before our data collection did not have available results. For CPSP, results could be found for 7 of the 8 trials (88%) completed at least 12 months before our data collection, of which 6 (75%) were in peer-reviewed journals. No CPSP trials were completed <12 months before our data collection. Although the total number of trials with a specified completion date is very small, CRPS and CPSP were not significantly different for availability of any results or results in peer-reviewed journals.

Four of the 31 CRPS trials eligible for a results search had an industry primary sponsor. Of these trials, 2 had results available (50%), but none in the peer-reviewed literature. The remaining 27 had a nonindustry primary sponsor, of which 9 had results available (33%), all in the peer-reviewed literature. Six of the 11 CPSP trials eligible for a results search had an industry primary sponsor. All 6 had available results, of which 5 (83%) were in the peer-reviewed literature. The remaining 5 had a nonindustry primary sponsor, of which 2 had results available (40%), all in the peer-reviewed literature.


3.2. RReADS-SCS and RReADS-rTMS/tDCS

As of February 2014, there were 72 trials for SCS (Table 3) and 92 trials for rTMS/tDCS (Table 4) meeting criteria for inclusion in the RReADS database. The RReADS-SCS and RReADS-rTMS/tDCS databases are supplemental files for this article. A total of 131 trials were registered on CTG (55 SCS and 76 rTMS/tDCS). The other 33 trials (17 SCS and 16 rTMS/tDCS) were listed exclusively on 1 or more of ICTRP's 14 other registries. Only 3 trials were multiply-registered, all for SCS. As shown in Figure 1, 2012 brought the greatest number of new trials for SCS and 2013 had the greatest number of new trials for rTMS/tDCS.

Table 3

Table 4

For SCS, 18 of the 72 total trials were actively recruiting participants, 4 were withdrawn before subject enrollment, and 1 was not yet open for recruitment. Results were sought for the remaining 49 trials (22 trials listed as completed, 12 trials terminated after beginning enrollment, 8 trials listed as active but not recruiting, and 7 trials of unknown status). For rTMS/tDCS, 34 of the 92 total trials were actively recruiting participants, 2 were withdrawn before subject enrollment, and 12 were not yet open for recruitment. Results were sought for the remaining 44 trials (25 trials listed as completed, 1 trial terminated after beginning enrollment, 5 trials listed as active but not recruiting, and 13 trials of unknown status).

Forty-five percent of SCS trials (22/49) and 48% of rTMS/tDCS trials (21/44) had available results. Thirty-three percent of SCS trials (16/49) and 45% of rTMS/tDCS trials (20/44) had results in a peer-reviewed journal. Fourteen percent (5/36) of publications for rTMS, tDCS, and SCS trials were linked directly to the registry, and the remaining 86% (31/36) were found by manually searching through PubMed. For SCS, 2 trials had results available through direct posting on CTG and 4 had results available only in the gray literature. For rTMS/tDCS, 1 trial had results available through direct posting on CTG. There was no significant association between the type of device and the availability of results.

Focusing on those SCS trials where the registry entry showed the trial as “completed” and with a specific completion date, results could be found for 10 of the 18 trials (56%) completed at least 12 months before our data collection, of which 8 (44%) were in peer-reviewed journals. There were 3 SCS trials completed <12 months before our data collection, and only 1 had any available results (not peer reviewed). For rTMS/tDCS, results could be found for 14 of the 21 trials (67%) completed at least 12 months before our data collection, of which 13 (62%) were in peer-reviewed journals. Two rTMS/tDCS trials were completed <12 months before our data collection, and none had any available results. There was no significant association between the type of device and the availability of results for completed trials with a specified completion date.

Twenty-two of the 49 SCS trials eligible for a results search had an industry primary sponsor. Of these trials, 10 had results available (45%), of which 8 (36%) were in the peer-reviewed literature. The remaining 27 had a nonindustry primary sponsor, of which 12 (44%) had any available results, with 8 (30%) in the peer-reviewed literature. Only 2 of the 44 rTMS/tDCS trials eligible for a results search had an industry primary sponsor, and both had results available in the peer-reviewed literature. The remaining 42 had a nonindustry primary sponsor, of which 19 (45%) had any available results, with 18 (43%) in the peer-reviewed literature.


3.3. Comparison of RReADS, RReACT-CRPS, and RReACT-CPSP with RReACT and RReMiT

A total of 763 trials were eligible for a results search (391 PHN/DPN/FM, 237 migraine, 42 CRPS/CPSP, and 93 SCS/rTMS/tDCS). Irrespective of time since study completion, Figure 2 shows the number of trials eligible for a results search, the number with available results, and the number with results in the peer-reviewed literature for all 6 disorders (PHN, DPN, FM, migraine, CRPS, and CPSP) and both types of devices (SCS and rTMS/tDCS). In the RReMiT database, 55% of 237 eligible trials had any results available and 45% had peer-reviewed results available, irrespective of time since study completion.4 In the RReACT databases for PHN, DPN, and FM, 46% of 391 eligible trials had any results available and 30% had peer-reviewed results available, irrespective of time since study completion.30 In the RReACT databases for CRPS and CPSP, 45% of 42 eligible trials had any results available and 38% had peer-reviewed results available. In the RReADS databases for SCS, rTMS, and tDCS, 46% of 93 eligible trials had any results available and 39% had peer-reviewed results available.

Figure 2

Focusing on just those trials with a specified completion date at least 12 months before data collection, the percentages rise as follows (any results/peer-reviewed results): CRPS 67/60, CPSP 88/75, SCS 56/44, and rTMS-tDCS 67/62. Comparison data are available from the RReMiT database, where the percentages (irrespective of study completion date) rise from 55% for any results and 45% for peer-reviewed results to 70% for any results and 57% for peer-reviewed results at 12 months after study completion.4 Pooling the data sets yields 225 total trials with a specified completion date at least 12 months before data collection, and the percentage with any results is 69% and with peer-reviewed results is 57%. Extending the time window to studies with a specified completion date at least 2 years before data collection, the proportions show little further change for CRPS, CPSP, SCS, rTMS/tDCS, and migraine. Comparing the RReADS databases, all RReACT databases, and the RReMiT database, there were no significant differences in terms of results availability or proportion with peer-reviewed publications.
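The pooled ≥12-month percentages above can be checked arithmetically. The counts for CRPS, CPSP, SCS, and rTMS/tDCS come from the Results sections; the migraine counts in this sketch are not reported in this article but are back-derived from the stated pooled total of 225 trials and the RReMiT percentages, so treat them as inferred approximations.

```python
# Sanity-check the pooled percentages for trials completed >=12 months
# before data collection. Tuples are (eligible, any results, peer-reviewed).
# The migraine counts are inferred from the stated pooled total of 225
# trials and the RReMiT 70%/57% figures; they are approximations.
cohorts = {
    "CRPS":      (15, 10, 9),
    "CPSP":      (8, 7, 6),
    "SCS":       (18, 10, 8),
    "rTMS/tDCS": (21, 14, 13),
    "migraine":  (163, 114, 93),  # inferred, see lead-in
}
total = sum(n for n, _, _ in cohorts.values())
any_results = sum(a for _, a, _ in cohorts.values())
peer = sum(p for _, _, p in cohorts.values())
print(total)                             # → 225
print(round(100 * any_results / total))  # → 69
print(round(100 * peer / total))         # → 57
```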


4. Discussion

In evidence-based medicine, randomized controlled trials are the gold standard for establishing the safety and efficacy of an intervention. However, the current infrastructure of evidence-based medicine (the levels of evidence and grades of recommendation) is not necessarily generalizable to evaluating invasive interventions such as SCS. Research on non-pharmacological interventions raises challenges related to randomization procedures and blinding, and randomized controlled trials are not always feasible.1,6,22,26,29,40,43 Despite the expense of new technologies, rigorously controlled trial designs assessing basic efficacy have not been required for SCS device approvals, even after the advent of clinical trial registries. Designing an SCS trial with an appropriate double-blind control is problematic because the induction of paresthesias in the area of pain is part of the therapeutic assessment, and the absence of masking can lead to several types of bias.6,14 Studies of rTMS and tDCS can be more rigorously controlled: because the devices are noninvasively applied to the skull, masking noises, combined with mimicry of the cutaneous sensation and muscular discomfort these devices produce, can effectively blind the wearer to whether the device has been turned on.1,40,43 Full double-blinding may be possible if all assessments are made by an independent person not involved in operating the device.

Because of the methodological and practical constraints associated with device-based research, the RReADS databases include prospective studies that were nonrandomized or observational, as such designs are often used when a randomized controlled trial is not feasible. In nonrandomized trials, a rigorous prospective design and focused data collection can reduce bias caused by incomplete data or unmasked outcome assessment.6 Although a formal analysis was beyond the scope of this project, our impression is that many registry records in the RReADS database contain vaguely described study designs, multiple exploratory endpoints without a clearly specified primary outcome, and non-standard measures only indirectly assessing pain. Pre-specified statistical analysis plans were often missing or minimal. An ACTTION systematic review comparing registered and published primary outcome specifications in the analgesic trials contained in the initial version of RReACT found many discrepancies.44 For some trials in the RReADS database, the design description on the registry was so unclear that we were unable to confirm the accuracy of the trial–publication pairing (eg, SCS trials ACTRN12612000350820 and ISRCTN33292457).

Kessler et al.21 suggest that off-label use of an approved medical device allows clinicians to uncover new uses in an experiential manner, but caution that therapeutic procedures using established devices for new indications do not always receive systematic, rigorous evaluation. The threat of publication bias and selective outcome reporting is particularly great in neuromodulation and intervention-based research, where advances have historically relied on case reports or case series, and issues of study design, blinding, and bias may be unresolvable.6,14,29,42 Ergina et al.6 suggest that for procedure-based interventions, a distinction should be made between explanatory trials, which evaluate the efficacy of the intervention, and pragmatic trials, which assess how the procedure is administered in clinical practice and seek to inform clinical decision-making. Many trials in the RReADS databases, especially of SCS, had primary outcomes other than analgesia and seemed directed toward how best to deliver the intervention or toward evaluating the technical performance of new technologies. The decision to permanently implant an SCS device is based on the clinical response to temporary lead placement. Spinal cord stimulation trials therefore either recruit patients who already have an implanted device or provide efficacy analyses only on the patients progressing to full implantation; they are thus comparable to enriched enrollment trials of new drugs, but not to the classic parallel-design, placebo-controlled drug trial or to any type of crossover trial.

Device-based investigation faces financial, regulatory, and insurance barriers combined with limited funding.2,20 Only 3% of total neuroscience research funding in 2005 was directed toward medical devices.3 Only 2 rTMS/tDCS trials were registered with an industry primary sponsor. Both rTMS and tDCS are still early in their development, with tDCS not yet approved by the FDA, and rTMS only approved for depression. Many questions remain, such as whether to target deep or superficial structures, how to best apply “sham” stimulation, and the optimum frequency and duration of stimulation sessions to achieve an enduring effect.1,22,26,40,43,52 The market potential of rTMS and tDCS, an important incentive for large-scale industry partnership, is not certain. Spinal cord stimulation, which has been FDA-approved for more than 20 years, has a much larger proportion of industry-funded trials. In the RReADS-SCS database, almost half (45%) of trials eligible for a results search had been registered with an industry primary sponsor.

The new RReACT databases and the RReADS databases suggest an industry role different from that in PHN, DPN, FM, and migraine, all of which have multiple approved drugs and are frequently targeted in phase 2/3 new compound development programs. Only 2 CRPS trials (of AV-411 and lenalidomide), and no CPSP trials, appeared to represent an industry registration effort for a new oral drug. Patients with CRPS may be involved in litigation or disability claims, and many CPSP sufferers have too much neurological impairment to serve as trial subjects. Regulatory approval pathways for therapeutic devices differ from those governing new drug development, which might explain the role of industry in the SCS and rTMS/tDCS trials.32,41

We found the proportion of trials without available results to be similar across a diverse range of pain disorders and treatment strategies, as has been shown to be true in psychiatric disorders and other medical conditions.10,11,13,17,19,23,24,33,36,48,50,51 Results are rarely available within a year of trial completion, as would be expected, but the effect of time is similar across all conditions for which this analysis could be conducted. For studies providing a specified completion date of at least 12 months before data collection, the percentage only rises to 69% for any results and to 57% for peer-reviewed results. Extending the window to 24 months for results to become available only slightly increases these percentages. In migraine, studies with primary industry sponsorship were more likely to be published, but industry sponsorship had no apparent effect in CRPS, CPSP, and device trials.4 For disorders difficult to study using a typical randomized controlled trial design (such as CRPS and CPSP), there may be many fewer trials, but the ones that are conducted are no less likely to have available results. Device trials have substantial design and blinding issues, but not a distinct publication issue.

For all disease areas, what can be done to increase the proportion of registered trials with available results? First, groups advocating for increased transparency in clinical trials research deserve support. For example, AllTrials (http://www.alltrials.net) is an initiative that includes BMJ, the Cochrane Collaboration, PLOS, and the Dartmouth Institute for Health Policy & Clinical Practice.11 Second, existing regulations require posting of study results within 12 months of study completion for certain categories of trials; compliance is poor, and enforcement could be better.10,24,25,33 Third, large registries besides CTG could enable direct posting of study results and better support investigators who wish to do so. Fourth, far too much manual searching was required to create RReACT, RReMiT, and RReADS. For CRPS and CPSP, 17% of trials had 1 or more published articles linked to the trial record, a proportion that increased to 38% when PubMed was manually searched. For rTMS, tDCS, and SCS, the proportion increased from 5% of trials to 39%; and in RReMiT, the proportion increased from 20% to 45% through manual searches. If all journals would provide the trial registration number, preferably in the abstract, accurate links could become automatic on all registries.4 Fifth, the apparent barriers to publishing negative trials must be reduced, a focus of Project OPEN (http://www.open-project.eu/welcome).4,30,39,55 “Negative” clinical trials have lower publication rates and take longer to be published, although it remains uncertain whether the impact factor of the journal they eventually appear in is lower.19,48 Peer-reviewed journals welcoming unexpected, controversial, provocative, and/or negative results, such as the Journal of Negative Results in Biomedicine, are a step forward but not necessarily feasible for investigator-initiated trials with insufficient funds to pay publication costs. Could a narrowly focused, peer-reviewed clinical trials “brief communications” journal, using a standardized results-reporting format and offering authors limited statistical assistance, succeed?

The RReACT, RReMiT, and RReADS databases are not designed to delve deeper into issues of publication bias and selective reporting of results. However, by spanning almost 1000 trials in 6 disorders and 2 types of devices, a universal problem is clearly apparent. Irrespective of time since completion, more than half of all registered clinical trials do not have publicly available results. Evidence-based medicine must more rigorously take into account the sheer magnitude of missing results in formulating treatment recommendations.


Conflict of interest statement

Supported by a research subaward from the Food and Drug Administration (FDA) U101 FD004187-02 (University of Rochester, R. H. Dworkin, PI), Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks (ACTTION) public–private partnership with the U.S. FDA. The authors declare no conflicts of interest.


Appendix A. Supplemental Digital Content

Supplemental Digital Content associated with this article can be found online at http://links.lww.com/PAIN/A6, http://links.lww.com/PAIN/A7, http://links.lww.com/PAIN/A8 and http://links.lww.com/PAIN/A9.


References

[1]. Arana AB, Borckardt JJ, Ricci R, Anderson B, Li X, Linder KJ, Long J, Sackeim HA, George MS. Focal electrical stimulation as a sham control for repetitive transcranial magnetic stimulation: does it truly mimic the cutaneous sensation and pain of active prefrontal repetitive transcranial magnetic stimulation? Brain Stimul 2008;1:44–51.
[2]. Bergsland J, Elle OJ, Fosse E. Barriers to medical device innovation. Med Devices (Auckl) 2014;7:205–9.
[3]. Dorsey ER, Vitticore P, De Roulet J, Thompson JP, Carrasco M, Johnston SC, Holloway RG, Moses H III. Financial anatomy of neuroscience research. Ann Neurol 2006;60:652–9.
[4]. Dufka FL, Dworkin RH, Rowbotham MC. How transparent are migraine clinical trials? Repository of Registered Migraine Trials (RReMiT). Neurology 2014;83:1–10.
[5]. Dwan K, Altman DG, Cresswell L, Blundell M, Gamble CL, Williamson PR. Comparison of protocols and registry entries to published reports for randomised controlled trials. Cochrane Database Syst Rev 2011:MR000031.
[6]. Ergina PL, Cook JA, Blazeby JM, Boutron I, Clavien PA, Reeves BC, Seiler CM; Balliol Collaboration, Altman DG, Aronson JK, Barkun JS, Campbell WB, Cook JA, Feldman LS, Flum DR, Glasziou P, Maddern GJ, Marshall JC, McCulloch P, Nicholl J, Strasberg SM, Meakins JL, Ashby D, Black N, Bunker J, Burton M, Campbell M, Chalkidou K, Chalmers I, de Leval M, Deeks J, Grant A, Gray M, Greenhalgh R, Jenicek M, Kehoe S, Lilford R, Littlejohns P, Loke Y, Madhock R, McPherson K, Rothwell P, Summerskill B, Taggart D, Tekkis P, Thompson M, Treasure T, Trohler U, Vandenbroucke J. Challenges in evaluating surgical innovation. Lancet 2009;374:1097–104.
[7]. Food and Drug Administration Amendments Act of 2007. Title VIII: Clinical trials databases, Section 801. 110th Congress Public Law 85. DOCID: f:publ085.110 [Internet]. 2007. Available at: http://www.gpo.gov/fdsys/pkg/PLAW-110publ85/html/PLAW-110publ85.htm. Accessed July 10, 2014.
[8]. Food and Drug Administration. Code of federal regulations. Title 21, Volume 8. Sec. 882.5805 [Internet]. Available at: http://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfcfr/cfrsearch.cfm?fr=882.5805. Revised April 1, 2013. Accessed July 2, 2014.
[9]. Food and Drug Administration. Code of federal regulations. Title 21, Volume 8. Sec. 882.5880 [Internet]. Available at: http://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfcfr/cfrsearch.cfm?fr=882.5880. Revised April 1, 2013. Accessed July 2, 2014.
[10]. Gill CJ. How often do US-based human subjects research studies register on time, and how often do they post their results? A statistical analysis of the Clinicaltrials.gov database. BMJ Open 2012;2:e001186.
[11]. Goldacre B. Are clinical trial data shared sufficiently today? No. BMJ 2013;347:f1880.
[12]. Greene K, Dworkin RH, Rowbotham MC. A snapshot and scorecard for analgesic clinical trials for chronic pain: the RReACT database. PAIN 2012;153:1794–7.
[13]. Hart B, Lundh A, Bero L. Effect of reporting bias on meta-analyses of drug trials: reanalysis of meta-analyses. BMJ 2012;344:d7202.
[14]. Higgins JPT, Altman DG, Sterne JAC, editors. Chapter 8: assessing risk of bias in included studies. In: Higgins JPT, Green S, editors. Cochrane handbook for systematic reviews of interventions. Version 5.1.0 [updated March 2011]. The Cochrane Collaboration, 2011. Available at: www.cochrane-handbook.org. Accessed November 26, 2014.
[15]. Huser V, Cimino JJ. Precision and negative predictive value of links between ClinicalTrials.gov and PubMed. AMIA Annu Symp Proc 2012;2012:400–8.
[16]. Huser V, Cimino JJ. Evaluating adherence to the International Committee of Medical Journal Editors' policy of mandatory, timely clinical trial registration. J Am Med Inform Assoc 2013;20:e169–74.
[17]. Huser V, Cimino JJ. Linking ClinicalTrials.gov and PubMed to track results of interventional human clinical trials. PLoS One 2013;8:e68409.
[18]. Jones CW, Handler L, Crowell KE, Keil LG, Weaver MA, Platts-Mills TF. Non-publication of large randomized clinical trials: cross sectional analysis. BMJ 2013;347:f6104.
[19]. Kanaan Z, Galandiuk S, Abby M, Shannon KV, Dajani D, Hicks N, Rai SN. The value of lesser-impact-factor surgical journals as a source of negative and inconclusive outcomes reporting. Ann Surg 2011;253:619–23.
[20]. Kelly ML, Malone D, Okun MS, Booth J, Machado AG. Barriers to investigator-initiated deep brain stimulation and device research. Neurology 2014;82:1465–73.
[21]. Kessler L, Ramsey SD, Tunis S, Sullivan SD. Clinical use of medical devices in the “Bermuda Triangle”. Health Aff (Millwood) 2004;23:200–7.
[22]. Kessler SK, Turkeltaub PE, Benson JG, Hamilton RH. Differences in the experience of active and sham transcranial direct current stimulation. Brain Stimul 2012;5:155–62.
[23]. Kirillova O. Results and outcome reporting in ClinicalTrials.gov, what makes it happen? PLoS One 2012;7:e37847.
[24]. Kuehn BM. Few studies reporting results at US government clinical trials site. JAMA 2012;307:651–3.
[25]. Law MR, Kawasumi Y, Morgan SG. Despite law, fewer than one in eight completed studies of drugs and biologics are reported on time on ClinicalTrials.gov. Health Aff (Millwood) 2011;30:2338–45.
[26]. Loo CK, Taylor JL, Gandevia SC, McDarmont BN, Mitchell PB, Sachdev PS. Transcranial magnetic stimulation (TMS) in controlled treatment studies: are some “sham” forms active? Biol Psychiatry 2000;47:325–31.
[27]. Mathieu S, Boutron I, Moher D, Altman DG, Ravaud P. Comparison of registered and published primary outcomes in randomized controlled trials. JAMA 2009;302:977–84.
[28]. McGauran N, Wieseler B, Kreis J, Schüler YB, Kölsch H, Kaiser T. Reporting bias in medical research—a narrative review. Trials 2010;11:37.
[29]. Meakins JL. Innovation in surgery: the rules of evidence. Am J Surg 2002;183:399–405.
[30]. Munch T, Dufka FL, Greene K, Smith SM, Dworkin RH, Rowbotham MC. RReACT goes global: perils and pitfalls of constructing a global open-access database of registered analgesic clinical trials and trial results. PAIN 2014;155:1313–17.
[31]. O'Connell NE, Wand BM, Marston L, Spencer S, Desouza LH. Non-invasive brain stimulation techniques for chronic pain. Cochrane Database Syst Rev 2014;4:CD008208.
[32]. Peña C, Bowsher K, Costello A, De Luca R, Doll S, Li K, Schroeder M, Stevens T. An overview of FDA medical device regulation as it relates to deep brain stimulation devices. IEEE Trans Neural Syst Rehabil Eng 2007;15:421–4.
[33]. Prayle AP, Hurley MN, Smyth AR. Compliance with mandatory reporting of clinical trial results on ClinicalTrials.gov: cross sectional study. BMJ 2012;344:d7373.
[34]. Priori A, Ciocca M, Parazzini M, Vergari M, Ferrucci R. Transcranial cerebellar direct current stimulation and transcutaneous spinal cord direct current stimulation as innovative tools for neuroscientists. J Physiol 2014;592(pt 16):3345–69.
[35]. Rising K, Bacchetti P, Bero L. Reporting bias in drug trials submitted to the Food and Drug Administration: review of publication and presentation. PLoS Med 2008;5:e217; discussion e217.
[36]. Ross JS, Mulvey GK, Hines EM, Nissen SE, Krumholz HM. Trial publication after registration in ClinicalTrials.gov: a cross-sectional analysis. PLoS Med 2009;6:1–9.
[37]. Ross JS, Tse T, Zarin DA, Xu H, Zhou L, Krumholz HM. Publication of NIH funded trials registered in ClinicalTrials.gov: cross sectional analysis. BMJ 2012;344:d7292.
[38]. Rowbotham MC. The impact of selective publication on clinical research in pain. PAIN 2008;140:401–4.
[39]. Rowbotham MC. The case for publishing “negative” clinical trials. PAIN 2009;146:225–6.
[40]. Russo R, Wallace D, Fitzgerald PB, Cooper NR. Perception of comfort during active and sham transcranial direct current stimulation: a double blind study. Brain Stimul 2013;6:946–51.
[41]. Sastry A. Overview of the US FDA medical device approval process. Curr Cardiol Rep 2014;16:494.
[42]. Schlaepfer TE, Fins JJ. Deep brain stimulation and the neuroethics of responsible publishing: when one is not enough. JAMA 2010;303:775–6.
[43]. Sheffer CE, Mennemeier MS, Landes RD, Dornhoffer J, Kimbrell T, Bickel WK, Brackman S, Chelette KC, Brown G, Vuong M. Focal electrical stimulation as an effective sham control for active rTMS and biofeedback treatments. Appl Psychophysiol Biofeedback 2013;38:171–6.
[44]. Smith SM, Wang AT, Pereira A, Chang RD, McKeown A, Greene K, Rowbotham MC, Burke LB, Coplan P, Gilron I, Hertz SH, Katz NP, Lin AH, McDermott MP, Papadopoulos EJ, Rappaport BA, Sweeney M, Turk DC, Dworkin RH. Discrepancies between registered and published primary outcome specifications in analgesic trials: ACTTION systematic review and recommendations. PAIN 2013;154:2769–74.
[45]. Song F, Parekh S, Hooper L, Loke YK, Ryder J, Sutton AJ, Hing C, Kwok CS, Pang C, Harvey I. Dissemination and publication of research findings: an updated review of related biases. Health Technol Assess 2010;14:iii, ix–xi, 1–193.
[46]. Song JJ, Popescu A, Bell RL. Present and potential use of spinal cord stimulation to control chronic pain. Pain Physician 2014;17:235–46.
[47]. Strech D. Normative arguments and new solutions for the unbiased registration and publication of clinical trials. J Clin Epidemiol 2012;65:276–81.
[48]. Suñé P, Suñé JM, Montoro JB. Positive outcomes influence the rate and time to publication, but not the impact factor of publications of clinical trial results. PLoS One 2013;8:e54583.
[49]. Tse T, Williams RJ, Zarin DA. Reporting “basic results” in ClinicalTrials.gov. Chest 2009;136:295–303.
[50]. Turner EH, Knoepflmacher D, Shapley L. Publication bias in antipsychotic trials: an analysis of efficacy comparing the published literature to the US Food and Drug Administration database. PLoS Med 2012;9:e1001189.
[51]. Turner EH, Matthews AM, Linardatos E, Tell RA, Rosenthal R. Selective publication of antidepressant trials and its influence on apparent efficacy. N Engl J Med 2008;358:252–60.
[52]. Tzabazis A, Aparici CM, Rowbotham MC, Schneider MB, Etkin A, Yeomans DC. Shaped magnetic field pulses by multi-coil repetitive transcranial magnetic stimulation (rTMS) differentially modulate anterior cingulate cortex responses and pain in volunteers and fibromyalgia patients. Mol Pain 2013;9:33.
[53]. van de Wetering FT, Scholten RJ, Haring T, Clarke M, Hooft L. Trial registration numbers are underreported in biomedical publications. PLoS One 2012;7:e49599.
[54]. Vedula SS, Bero L, Scherer RW, Dickersin K. Outcome reporting in industry-sponsored trials of gabapentin for off-label use. N Engl J Med 2009;361:1963–71.
[55]. Wager E, Williams P; Project Overcome failure to Publish nEgative fiNdings Consortium. “Hardly worth the effort?” Medical journals' policies and their editors' and publishers' views on trial registration and publication bias: quantitative and qualitative study. BMJ 2013;347:f5248.
[56]. Whittington CJ, Kendall T, Fonagy P, Cottrell D, Cotgrove A, Boddington E. Selective serotonin reuptake inhibitors in childhood depression: systematic review of published versus unpublished data. Lancet 2004;363:1341–5.
[57]. World Health Organization. International Standards for Clinical Trial Registries [Internet]. 2012. Available at: http://apps.who.int/iris/bitstream/10665/76705/1/9789241504294_eng.pdf. Accessed May 23, 2013.
[58]. World Health Organization Registry Network. About registries: WHO Registry Criteria (version 2.1, April 2009) [Internet]. 2013. Available at: http://www.who.int/ictrp/network/criteria_summary/en/index.html. Accessed August 23, 2013.
[59]. Zarin DA, Tse T. Medicine. Moving toward transparency of clinical trials. Science 2008;319:1340–2.
[60]. Zarin DA, Tse T, Williams RJ, Califf RM, Ide NC. The ClinicalTrials.gov results database: update and key issues. N Engl J Med 2011;364:852–60.
Keywords:

Clinical trials; Transparency in research; Registries; RReACT; RReADS; RReMiT; Global databases; Publication bias; Selective reporting

© 2015 International Association for the Study of Pain