The growing field of eHealth, defined as the use of information and communications technology for health,135 has the potential to provide numerous benefits for consumers and health systems, such as improving the accessibility and cost-effectiveness of health care.37,81 A plethora of eHealth tools (eg, mobile applications and online interventions) have been developed for a range of health conditions and symptoms,90,98,102 including acute and chronic pain,54,69,117 with promising results regarding their effectiveness.81 Although considerable grant funding may be spent on the development and evaluation of evidence-based eHealth tools for pain with the rationale that they will improve access to care, current evidence suggests that end users (ie, targeted users of the tool) do not have access to these tools outside research studies, reducing their real-world impact.81 For example, one systematic review identified 47 articles describing 34 pain-related mobile applications (apps), none of which were publicly available in app stores.66 Reviews of publicly available pain-related mobile apps found that 0/283 apps had been reported on in the published scientific literature66 and only 1/279 apps had been empirically evaluated.68
The possibility that significant research funding is spent developing eHealth tools that are never made available to end users is a concerning example of potential “research waste” (ie, avoidable waste in the research process21). Barriers to making research outputs available (eg, through commercialization) that are commonly endorsed by researchers include lack of financial support, dedicated time, industry partnerships, and sufficient institutional infrastructure.77,129 Tool availability may be facilitated by researchers' intrinsic motivations,70 user-centered design processes (ie, methodology based on the premise that involving end users throughout the tool design process improves user experience and effectiveness136), and implementation efforts targeted to eHealth contexts and users.43,76
Understanding the barriers and facilitators associated with the availability of pain-related eHealth tools to end users is critical for improving tool availability and reducing potential research waste. This study systematically reviewed eHealth tools for pediatric pain across tool types (eg, mobile apps and online interventions), goals (eg, assessment or management), and pain types (eg, acute, chronic, and disease-related pain). It focused on pediatric pain because pain is a common experience for children and adolescents35,63 and eHealth tools are often the preferred modality among this age group.110 This study makes novel contributions to the field by examining potential barriers and facilitators (including user-centered design processes) to pediatric pain eHealth tool availability and quantifying funding used to develop and evaluate available and unavailable tools.
This study consisted of 2 components: (1) a systematic review of eHealth tools for pediatric pain assessment and/or management published in the past 10 years, and (2) a survey completed by authors of the identified tools regarding their eHealth tool's availability, barriers or facilitators to availability, tool design process, and grant funding used. The 10-year timeline was chosen given the rapid changes in technology in the field of eHealth tools. It was hypothesized that: (1) many researcher-developed eHealth tools meeting inclusion criteria would be unavailable to end users in any form, (2) researcher-reported barriers to tool availability would include outdated technology and system-level barriers (eg, lack of funding and insufficient institutional infrastructure), and facilitators would include researchers' individual beliefs about the importance of tool availability, and (3) considerable research funding would be spent on tools that are unavailable to end users.
2.1. Systematic review
The systematic review was conducted following established guidelines for this method55,87 and was registered with PROSPERO (registration: CRD42017069910).
2.1.1. Search strategy
A systematic search was conducted in 4 databases (PubMed, PsycINFO, CINAHL, and EMBASE) from May 2 to 3, 2017. The search strategy included terms for eHealth (eg, eHealth and mHealth), pain (eg, pain, chronic pain, headaches, and needles), assessment and management (eg, assessment, intervention, and therapy), and children (eg, child and pediatrics) (see Appendix A for full search strategy, available at https://links.lww.com/PR9/A31). Medical subject headings (MeSH terms) and other key terms were also used. The search strategy was developed in consultation with a librarian and experts in the fields of pediatric pain and eHealth tool development, and followed previous research validating optimal search terms for identifying pediatric research.73 Additional hand searching of previous reviews on similar topics66,114 and reference sections of included full-text articles was also completed.
2.1.2. Eligibility criteria
Articles were eligible for inclusion in the review if they met the following criteria: (1) the article described an empirical study (published study or unpublished dissertation) written in English and published within the past 10 years (ie, January 1, 2007–May 3, 2017), (2) the article described the development of an eHealth tool for pediatric pain assessment or management and/or evaluated the benefits of its use in the target population, and (3) the tool was studied in children and adolescents (aged 0–18 years, sample median age <19 years) or their parents/caregivers. The 10-year timeline for the systematic search was chosen given the rapid evolution of eHealth technology that can quickly lead to developed eHealth tools becoming out of date.
2.1.3. Study selection and data extraction
Citations identified in the systematic search were imported into Covidence systematic review software31 and results were deduplicated. Two study authors (K.S.H. and P.R.T.) independently reviewed the titles and abstracts of each remaining citation to determine whether they met inclusion criteria; discrepancies were resolved by consensus. Next, a full-text screen was completed of all remaining articles (by K.S.H. and P.R.T.) to determine final inclusion and discrepancies were again resolved by consensus. Before completing data extraction, a list of all included articles, organized by tool, was developed by one author (K.S.H.) with any unclear situations resolved through consultation with the other team members.
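The deduplication step above is handled automatically by Covidence; purely as an illustrative sketch (not the software's actual algorithm), the underlying idea can be conveyed by keying each citation record on a normalized title and publication year. All record contents below are hypothetical.

```python
def normalize(title: str) -> str:
    """Lowercase and strip non-alphanumeric characters so trivially
    different renderings of the same title compare equal."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

def deduplicate(citations):
    """Keep the first record seen for each (normalized title, year) key."""
    seen = set()
    unique = []
    for record in citations:
        key = (normalize(record["title"]), record["year"])
        if key not in seen:
            seen.add(key)
            unique.append(record)
    return unique

# Hypothetical records: the first two differ only in capitalization
# and punctuation, so they collapse to one entry
records = [
    {"title": "An eHealth Tool for Pediatric Pain", "year": 2015},
    {"title": "An eHealth tool for pediatric pain.", "year": 2015},
    {"title": "Usability of a Pain App", "year": 2016},
]
print(len(deduplicate(records)))  # 2
```

Real reference managers use fuzzier matching (eg, on DOI or author lists); this sketch only conveys the idea of collapsing near-identical records before screening.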
A data extraction manual was developed by the authors for this review. During data extraction, information was collected in the following areas: study identifiers (eg, title, corresponding author name and contact information, and name of the eHealth tool described), tool focus (eg, pain assessment, management, or combination of both; aimed at children/adolescents or their parents/caregivers; and type of pain the tool was intended for), characteristics of the tool (eg, type of tool and theory on which it was based), and information on the study described (eg, study design, participants, and results). Whether or not each article evaluated the following aspects of the eHealth tool was recorded: functionality (ie, whether the tool worked correctly and did the functions it was designed to do), usability (ie, ease of use, ease of learning to use the tool, and degree to which the tool could be used by the target population to achieve outcomes with effectiveness and efficiency), accessibility (eg, compliance with Web Content Accessibility Guidelines 2.0 [WCAG 2.0133], guidelines for making web content accessible for individuals with various disabilities, developed collaboratively with international input from industry, governments, researchers, and disability organizations), user experience (ie, participants' experiences with using the tool, and perceptions and responses resulting from using the tool), and feasibility (ie, whether the tool was suitable to use in the target population or practical for use in everyday life). When one article described multiple studies or samples, data from each were extracted separately. Data were extracted from each article by 2 independent coders (K.S.H. and P.R.T.) and entered into an Excel spreadsheet; disagreements were resolved by consensus, consistent with established systematic review guidelines.55 Descriptive statistics were used to summarize the results of the data extraction process.
After identification of the included eHealth tools, tool availability was examined by searching Google and in app stores (Apple App Store, Google Play, Windows Store, and Blackberry World, following methods from previous reviews66,68). These searches served to gather availability information for all tools meeting inclusion criteria and supplemented the voluntary tool author survey data. A tool was considered available if any means of accessing the tool was evident based on these searches (eg, available for free or for purchase, with a specific user account or health system, from a website, app store, or other location, etc.). These searches were conducted independently by 2 individuals, with discrepancies discussed and resolved by a third party (K.S.H.). Availability status and study of particular characteristics of the eHealth tools (functionality, usability, accessibility, user experience, and feasibility) were compared across tool types (pain assessment tools, pain management tools, and tools combining assessment and management functions) using χ2 tests.
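The availability-by-tool-type comparison described above is a standard chi-square test of independence on a 2 × 3 contingency table. In the sketch below, the per-category available/unavailable counts are hypothetical (chosen only to match the marginal totals reported later in the review: 16 assessment, 29 management, and 8 combined tools, 15 available overall); the `scipy.stats.chi2_contingency` call shows the form of the analysis, not the actual data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: available / unavailable; columns: assessment, management, combined.
# Counts are illustrative, consistent only with the reported marginals.
table = np.array([
    [5, 7, 3],    # available (hypothetical split of the 15 available tools)
    [11, 22, 5],  # unavailable (hypothetical split of the remaining 38)
])

# chi2_contingency returns the statistic, p-value, degrees of freedom,
# and the table of expected frequencies under independence
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.3f}, p = {p:.3f}")
```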
2.2. Author survey
Corresponding author contact information was identified from each included publication and authors were invited by email to take part in the survey. If the email address provided was no longer active, a coauthor whose contact information was available was invited to participate. Where the same eHealth tool was described in multiple articles, only the corresponding or alternate coauthor of the original article was invited to participate in the study.
Authors were invited to complete the survey either by telephone or online based on their preference; all participants chose to complete the survey online. The survey was developed for the current study and included questions modified from previous surveys addressing similar topics in different research fields.3,77,129 The survey also included a previously developed validated measure of the user-centeredness of a tool's design process, the User-Centered Design-11 (UCD-11128). Survey questions addressed the following topics: (1) availability of the eHealth tool and, if available, its form (eg, mobile app, online intervention) and cost (free or paid), (2) barriers or facilitators to making the tool available (questions modified from Refs. 77,129; authors rated the extent to which each of 17 items was a barrier/facilitator of making their tool available on a 5-point scale ranging from strongly disagree to strongly agree), (3) the amount and currency of grant funding secured for developing and/or evaluating the tool and making it available to the public, if applicable, (4) the user-centeredness of the tool's design process (UCD-11128; 11 yes/no items regarding use of user-centered design strategies in tool development, original study α = 0.74, current study α = 0.85), and (5) author demographic information (questions modified from Ref. 3). A copy of the survey is available in Appendix B (available at https://links.lww.com/PR9/A31). To protect participant privacy, survey responses were not linked to a particular eHealth tool (ie, the survey was designed to be anonymous); however, participants were informed in the consent procedures that anonymity could not be guaranteed given the small, publicly known pool of potential participants for the study.
The study protocol was approved by the IWK Health Centre Research Ethics Board. Authors of included articles were invited to complete the survey or to forward the invitation to a coauthor if they believed they could better report on the described eHealth tool (eg, corresponding author was a trainee and felt their supervisor could better report on the tool). The email invitation introduced participants to the study and included a link to an online consent form with additional information about the study. Authors were also informed that they could decline to participate in the study and would not be contacted further. Reminder emails were sent 2 and 3 weeks after the initial invitation to those authors who had not yet contacted the research team (eg, to set up a telephone interview or to decline study participation).
All participants accessed the online survey by clicking a link provided in the invitation email. They were presented with an online version of the study consent form and asked to click “I agree” to indicate their consent to participate in the study. Participants could return to previous pages to change their responses if desired. The online survey took approximately 20 minutes to complete.
No compensation was provided for participating in the study. It is considered common practice by researchers in many fields to share additional information about their publications when requested by other researchers (eg, to provide study summary statistics for inclusion in meta-analyses). This survey invitation was considered to be an extension of this type of academic collaboration.
2.2.4. Data analysis
Descriptive statistics were used to describe the results of the author survey. Survey responses were separated into 2 groups based on whether they pertained to tools available or unavailable to end users. The frequency with which each barrier and facilitator of tool availability was endorsed at each level and the average score for each item were calculated to determine the most strongly endorsed barriers or facilitators in each sample subgroup. All reported grant funding amounts were converted to US dollars (USD) using exchange rates on November 13, 2017, before calculating total and average amounts of funding spent on the development and evaluation of available and unavailable tools. Authors' responses to the UCD-11 were totaled, and an independent-samples t test was used to compare available and unavailable tools on their total UCD-11 scores.
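Two of the analysis steps above lend themselves to a short sketch: converting grant amounts to USD at fixed-date exchange rates, and comparing UCD-11 totals between groups. All rates and scores below are invented for illustration; the fractional degrees of freedom reported in the results (t(16.01)) suggest a Welch (unequal-variance) t test, which `equal_var=False` requests.

```python
from scipy.stats import ttest_ind

# Hypothetical exchange rates to USD, frozen on a single conversion date
rates_to_usd = {"USD": 1.00, "CAD": 0.75, "EUR": 1.25}

def to_usd(amount: float, currency: str) -> float:
    """Convert a reported grant amount to USD at the fixed-date rate."""
    return amount * rates_to_usd[currency]

print(to_usd(100_000, "CAD"))  # 75000.0

# Invented UCD-11 totals (0-11 scale) for available vs unavailable tools
ucd_available = [9, 10, 8, 11, 7, 9, 10, 8, 9, 11, 10, 6, 9]
ucd_unavailable = [5, 7, 4, 8, 6, 3, 7, 5, 6, 4, 8, 5, 6]

# equal_var=False requests Welch's t test, which does not assume
# equal variances across the two groups
t, p = ttest_ind(ucd_available, ucd_unavailable, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```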
3.1. Systematic review
The screening process for the systematic review is illustrated using the PRISMA model87 in Figure 1. The systematic search identified 14,285 citations; 3,817 duplicates were removed, resulting in 10,468 unique citations. The titles and abstracts of the 10,468 citations were screened, resulting in 215 articles identified as relevant for full-text review. The full-text review resulted in 90 articles meeting inclusion criteria, describing 53 unique eHealth tools across 97 studies. For descriptive information on the included eHealth tools and articles describing them, see Table 1.
Twenty-nine tools (54.71%) addressed pain management and 16 tools (30.19%) focused on pain assessment. Eight tools (15.09%) addressed both pain assessment and management. Thirty-four tools (64.15%) were intended for use by children and/or adolescents, 9 tools (16.98%) were intended for use by parents/caregivers of children and adolescents, and 10 tools (18.87%) were aimed at both. Half of the included eHealth tools were focused on chronic pain conditions (n = 26; 49.06%; eg, headaches, juvenile idiopathic arthritis); among the remainder, tools were intended for acute procedural pain (n = 7; 13.21%), cancer-related pain (n = 5; 9.43%), pain related to sickle cell disease (n = 5; 9.43%), postoperative pain (n = 4; 7.55%), or pain in children with cerebral palsy (n = 1; 1.89%). For 5 tools, a particular pain type was not specified (n = 5; 9.43%). Thirty-two tools could be used on computers (60.38%), 23 could be used on mobile devices (43.40%), and 12 could be used on other devices (22.64%; eg, personal digital assistants and devices developed by the authors); note that tools could be used on more than one device (both computers and mobile devices, n = 11; both computers and other devices, n = 2; and both mobile devices and other devices, n = 1). Twenty-eight tools were reported to be web-based (52.83%); others were reported to be iPhone/iOS apps (n = 5; 9.43%), Android apps (n = 1; 1.89%), other app types (n = 1; 1.89%), or other tool types (n = 22; 41.51%; eg, standalone devices and computer software programs). Seventeen tools reported being based on a particular theoretical model (32.08%; eg, cognitive behavioural therapy12; gate-control theory of pain82; Pender's Health Promotion Model99).
Among the 97 included studies, the most common study type was a validation study (n = 28; 28.87%), followed by randomized controlled trials (n = 21; 21.65%), observational studies (n = 18; 18.56%), usability studies (n = 13; 13.40%), “other” study types (n = 11; 11.34%; eg, implementation studies, single-case experimental designs, and feasibility studies), and pre–post designs (n = 6; 6.19%). None of the included studies described case studies or case series. The 97 studies described within 90 articles included a total of 9,035 children and adolescents, 3,314 parents/caregivers, 214 health care professionals, and 33 other participants (eg, researchers and survey respondents who did not specify a role). Fifty-seven studies described assessing tool feasibility using various methods (58.76%; eg, adherence to tool use during study, completion rates, and time required to complete interventions). Forty-nine studies reported assessing outcomes related to user experience (50.52%; eg, parent and child reports of acceptability or satisfaction with the tool, preference for eHealth tool over standard tools, and feedback on tool functions). Usability was described as being assessed in 30 studies (30.93%; eg, children's ability to use the tool effectively, children's understanding of tool functions, and ratings of ease of use), and 20 studies reported assessing functionality (20.62%; eg, rates of tool malfunctioning or usage errors). None of the studies described assessing accessibility (eg, WCAG 2.0 compliance). The majority of studies (96/97, 98.97%) reported at least some positive results regarding the eHealth tool examined (ie, results supporting the tool's efficacy or effectiveness for at least one outcome; results supporting the tool's usability, feasibility, etc.), and all tools (53/53, 100.00%) had studies reporting positive results on them.
Web searches conducted to examine the availability of each tool indicated that only 15 of the 53 identified eHealth tools were available to end users in some form (28.30%). Among the 15 available tools, 9 (60.00%) required some type of permission to gain access (eg, applying for the ability to access the tool; tool was only available to patients of particular clinics or research participants in particular studies). Four tools (26.67%) were found to be available through the Apple App Store, 4 through Google Play, and 1 through the Windows Store.
There was no significant difference in the proportion of tools found to be available (as examined by web searches) based on tool type (pain assessment, pain management, or combined assessment/management tools), χ2(2) = 0.650, P > 0.05. Regarding tool characteristics studied, a smaller proportion of pain assessment tools (44.8%) had at least one study examining their user experience compared with pain management tools (75.0%) or combination tools (87.5%), χ2(2) = 6.821, P < 0.05. A smaller proportion of pain management tools had at least one study examining usability (12.5%) compared with the other tool types (assessment: 51.7%; combination: 62.5%), χ2(2) = 8.244, P < 0.05. There were no significant differences across tool types in the proportions that had functionality (χ2(2) = 0.810, P > 0.05) or feasibility (χ2(2) = 0.119, P > 0.05) examined; accessibility was not examined in any study of any tool type.
3.2. Author survey
Corresponding authors for each of the 53 unique eHealth tools were invited to participate in the survey. Authors (n = 4) who were the corresponding author for more than one eHealth tool were asked to complete the survey once for each of their eHealth tools. Twenty-six responses to the online survey were received (49.06% response rate).
3.2.1. Availability of eHealth tools
Among the 26 responses received, 13 tools (50.00%) were identified as currently being available in some form. These tools were reported to be available to the general public (n = 5; 38.46%) or to patients of a particular clinic or health system (n = 8; 61.54%). Ten tools were reportedly available on websites (76.92%), 3 on the Apple App Store (23.08%), one through an Android app store (7.69%), and one through a specialty clinic (7.69%; tools could be available in more than one location). Twelve available tools (92.31%) were reported to be free of cost; one respondent abstained from responding about cost. Regarding ownership, available tools were reported to be owned by the author's institution (n = 9; 69.23%), by both the author and their institution (n = 2; 15.38%), by the author themselves (n = 1; 7.69%), or under a Creative Commons license (n = 1; 7.69%). Only 4 authors of available tools reported attempts to commercialize their tool (30.77%).
3.2.2. Facilitators and barriers to eHealth tool availability
3.2.2.1. Available tools
Descriptive statistics summarizing author responses regarding facilitators of tool availability are provided in Table 2 (note that authors only completed this section of the survey if they reported that their tool was currently available to end users in some form). The most commonly endorsed facilitators (ie, those with the highest mean scores where 5 = strongly agree and 1 = strongly disagree) were (1) belief in benefit to society/target population, (2) belief that making tool available is important to your research field, (3) belief that making tool available is important to academia, (4) tool had promising clinical/commercial application, and (5) financial support.
3.2.2.2. Unavailable tools
See Table 3 for frequencies of author responses regarding barriers impeding tool availability (note that authors only completed this section of the survey if they reported that their tool was not currently available to end users in any form). The most commonly endorsed factors (ie, those with the highest mean scores) were (1) lack of infrastructure to support tool availability, (2) lack of time, (3) not aware of how to commercialize or make tool available, (4) lack of industry partners, and (5) outdated technology of tool.
3.2.3. Grant funding
3.2.3.1. Available tools
Authors of 12 of the 13 available tools answered the survey questions about grant funding amounts; a total of $5,699,146.46 USD in grant funding was reportedly spent on the development and testing of all these eHealth tools combined (n = 12 tools with completed funding information; for one tool, grant sources but not amounts were reported). The average amount of grant funding reportedly spent on development and testing of each eHealth tool was $474,928.87 USD (n = 12 tools included in calculation; median = $157,971.96, range = $2,508,250.00, interquartile range = $671,033.90). When asked about the funding spent specifically on making the tool available to end users, 8 authors reported a total of $165,900.00 USD was used; 5 authors reported that they preferred not to answer this question. For 4 of 13 tools (30.77%), the principal investigator had reportedly budgeted in the original grant for work that would make the tool available to end users. Six of the responses (46.15%) indicated that making the tool available to end users was an expectation of the grant funding. Five responses (38.46%) indicated that additional funding was secured for making the eHealth tool available to end users.
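The descriptives reported above (mean, median, range as maximum minus minimum, and interquartile range) can be reproduced mechanically. The per-tool amounts below are invented, since only aggregate figures were published.

```python
import statistics

# Hypothetical per-tool funding amounts in USD (n = 12 tools)
amounts_usd = [2_000.0, 15_000.0, 90_000.0, 160_000.0, 155_000.0,
               300_000.0, 480_000.0, 650_000.0, 700_000.0, 850_000.0,
               950_000.0, 2_510_000.0]

mean = statistics.mean(amounts_usd)
median = statistics.median(amounts_usd)
value_range = max(amounts_usd) - min(amounts_usd)  # "range" as reported

# Interquartile range as Q3 - Q1, using the default (exclusive)
# quartile convention of statistics.quantiles
q1, _, q3 = statistics.quantiles(amounts_usd, n=4)
iqr = q3 - q1

print(f"mean={mean:,.2f}  median={median:,.2f}  "
      f"range={value_range:,.2f}  IQR={iqr:,.2f}")
```

Note that quartile conventions differ across statistical software, so IQR values computed from the same data can vary slightly depending on the package used.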
3.2.3.2. Unavailable tools
Authors of 10 unavailable tools answered the survey questions about grant funding amounts; a total of $3,144,253.06 USD in grant funding was reportedly spent on the development and testing of all these eHealth tools combined (n = 10 tools with completed funding information; for one tool, grant sources but not amounts were reported, and 2 authors indicated that they preferred not to answer this question). Three tools were reported to have had no grant funding used in the development and testing of the tool. The average amount of funding reportedly spent on each unavailable tool was $314,425.31 USD (n = 10 tools included in calculation; median = $11,336.50, range = $2,207,260.00, interquartile range = $347,975.54).
3.2.4. Design process
3.2.4.1. Available tools
Among the 13 eHealth tools available to end users, 7 (53.85%) were reportedly developed by a contracted third party, including private companies (n = 3), a combination of private company and in-house development (n = 2), other organizations (n = 1), or freelancers (n = 1). Six tools (46.15%) were reportedly developed in-house. Steps followed in the user-centered design process are depicted in Table 4. Available tools scored significantly higher on user-centeredness of the design process (ie, UCD-11 total score) compared with the unavailable tools, t(16.01) = 2.33, P < 0.05.
3.2.4.2. Unavailable tools
Among the 13 unavailable eHealth tools, 7 were reportedly developed in-house (53.85%) and 6 (46.15%) were contracted out (n = 5 to a private company; n = 1 to a student). Steps followed in the user-centered design process are shown in Table 4.
3.2.5. Author demographics
3.2.5.1. Available tools
Among the 13 tools reportedly available to end users in some form, 11 had authors who identified primarily as researchers; 2 identified as “other.” Most authors were primarily affiliated with an academic institution (n = 7), followed by a hospital (n = 4) or research centre (n = 2). Authors were most commonly situated within the fields of nursing (n = 7), psychology (n = 4), or other (n = 2; pain, computer science). Available tools most often had authors who self-identified as mid career (10–20 years; n = 7), early career (less than 10 years; n = 4), or senior career (more than 20 years; n = 2). No available tools were authored by trainees or postdoctoral fellows.
3.2.5.2. Unavailable tools
Of the 13 tools reported as unavailable to end users, 11 (84.62%) had authors who identified their primary role as researchers, and 2 (15.38%) had authors who identified primarily as clinicians. These authors were most commonly affiliated primarily with academic institutions (n = 10) or hospitals (n = 3). Tools were most commonly authored by individuals in the fields of psychology (n = 5), nursing (n = 4), anesthesia (n = 1), or other (n = 3; pain, pediatrics, surgery), and by authors identifying as mid career (n = 4), followed by early career (n = 3), senior career (n = 3), postdoctoral fellow (n = 2), or trainee (n = 1).
This study provides a systematic review of the recent empirical literature on eHealth tools for pediatric pain assessment and management and makes several novel contributions to this field. Results build on past research by describing author-reported barriers and facilitators to tool availability, funding secured for development and evaluation, and authors' use of user-centered tool design. As hypothesized, few pediatric pain–related eHealth tools reported in the published literature are available to end users (15/53 tools, 28.30%). This finding is consistent with previous research on mHealth tools for pain66 and web-based interventions for other health conditions,102 but must be considered in light of the fact that the current study included only published research. Thus, the extent to which unpublished tools are available to end users could not be determined.
Facilitators of tool availability most commonly endorsed by authors of available tools included personal beliefs in the importance of making their tools available to end users. This is consistent with past research on the role of researchers' intrinsic motivations in their engagement in research commercialization.70 Many of the barriers that authors of unavailable tools endorsed were system-level issues such as lack of infrastructure to support making tools available, consistent with previous studies of researcher-reported barriers to making research outputs available through commercialization.77,129 These results suggest that although some researchers may be motivated by strong personal beliefs to make their tools available, they are impeded from doing so because of systemic barriers and lack of support. These results contribute to the literature on the availability of pain-related eHealth tools by identifying potential targets for supporting researchers in making their eHealth tools available to end users.
Analysis of grant funding showed that on average, over $300,000 USD was spent to develop and/or evaluate each tool that was subsequently unavailable to end users (total of $3,144,253.06 USD, n = 10 tools included). This expenditure represents potentially wasted research resources and lost impact for end users. Results of this study showed that user-centered design processes were associated with tool availability, and thus, with reducing potential for research waste. This is consistent with the hypothesis that using user-centered design processes for eHealth tools may optimize tool design and adoption by end users136,137 and is a novel finding of the current study. These results may also reflect that user-centered design is more likely to be used by teams with greater funding availability or overall higher research quality, given that the consultations and iterative design required may take more time and resources than other methods.
Taken together, the results of this study suggest there may be little focus on implementation and commercialization processes by researchers who develop eHealth tools and perhaps by the academic and granting institutions in which they are situated. There are several possible contributing factors to this situation. Researchers may be most focused on demonstrating the efficacy or effectiveness of their tools; efforts to implement eHealth tools and other health interventions are often haphazard or ineffective.43 In addition, researchers may not be rewarded in career-relevant ways (eg, in consideration of tenure and promotion) for efforts such as implementation or commercialization, which do not map onto traditional metrics (eg, number of publications). Although commercialization efforts are often well considered by academic institutions, most tools are made available to users at no cost, thus realizing no financial benefit for the developer or their institution. Funding agencies may not prioritize knowledge translation and implementation and, thus, not provide appropriate budgets to support the initiation and maintenance of tool availability. Although the current study shows that academic eHealth tool developers are driven by intrinsic motivations for making their tools available to end users, it is likely difficult for many researchers to achieve this alone.
Several recommendations for improving the availability of eHealth tools for pediatric pain assessment and management can be made based on the results of the current study. It is important to recognize that individual researchers, end users, and larger systems (eg, academic institutions, granting agencies, and health systems) all have roles to play in improving eHealth tool availability and reducing potential research waste.86,91 Increased engagement of end users and other stakeholders from the earliest stages of the research process would allow researchers to better understand their priorities and perspectives and to develop more effective tools for end users, whether based in eHealth technology or not, because “sometimes the best solution is a human solution, not a technological one.”101 Researchers may be able to increase the chances of their tool becoming available and being effectively used by relevant end users by using user-centered design processes.109,136,137 Incorporating implementation research methods and planning for potential tool availability earlier in the research cycle may also be helpful. Research has been conceptualized as a pipeline11 from efficacy studies (examining the effect of interventions on clinical outcomes as studied under highly controlled conditions30) to effectiveness studies (examining the effect of interventions under less controlled, more realistic conditions108) to implementation research (studies of methods to promote uptake of research evidence or products into practice in intended contexts11,100). Given the rapid changes in technology that can lead to eHealth tools becoming out of date, reported as a barrier by developers of unavailable tools in the current study, efficient means of moving through the research pipeline while still covering each important component are needed. 
Increased use of study designs that facilitate efficient study of implementation outcomes earlier in the research process (eg, hybrid effectiveness-implementation designs, which examine tool effectiveness and implementation simultaneously13,33) may be particularly helpful to eHealth researchers. Researchers should consider such designs as a potential avenue for moving research forward at a pace that better fits changing technology in the field of eHealth while maintaining thorough examination of tools' efficacy and effectiveness. Researchers developing eHealth tools may also benefit from additional training in patient-centered research, knowledge translation, and implementation science methods.10 In addition, researchers should incorporate collection of cost-effectiveness data into their research designs; lack of information regarding initial and maintenance costs has been identified as a reason that eHealth tools may not be adopted by end users.46 Use of these strategies may help researchers establish the efficacy, effectiveness, safety, and usability of their eHealth tools, all of which are critical to establish before a tool is made available, in a way that keeps pace with technological change in the field.
Within academic institutions, reward structures could be altered to appropriately recognize the extensive effort required to make eHealth tools available to end users. These efforts should be reflected in the assessment of research outcomes and researchers' performance measures. Such shifts have begun to occur in some organizations (eg, Faculty of Medicine, University of Toronto9). Academic institutions could also foster a culture of innovation and entrepreneurship across faculties and better support researchers in exploring various pathways to tool availability, including commercialization. At the level of funding agencies, increased funding opportunities aimed at supporting tool availability (both initially and long term) and commercialization processes are needed. Granting agencies should require researchers to demonstrate plans for tool availability and sustainability in their proposals and should assist in developing partnerships between researchers and others (eg, industry partners) to enhance tool availability. Researchers have debated whether an increased focus on commercialization is appropriate within the academic environment and what impact such a focus has on research integrity.20 Other models of making tools available to end users, beyond simply pursuing commercialization and profit, should be explored.
Results must be considered in light of several limitations. First, response bias likely affected the author survey results, as a greater proportion of authors of available tools completed the survey compared with authors of unavailable tools. Moreover, authors of unavailable tools who completed the survey may have had greater personal interest in the topic than those who did not. Given the limited response rate, it is unclear how the results generalize to the broader population of researchers who have developed eHealth tools for pediatric pain. Second, our analytic approach was somewhat limited by concerns about participant confidentiality within a small, publicly known pool of potential participants. As such, survey responses and data extracted from published articles were not linked, relationships between tool characteristics (eg, focus on pain assessment vs management, target user population) and availability could not be explored, and author-reported grant information could not be verified. The extent of evidence supporting each tool was not examined, and thus the relationship between evidence base and tool availability could not be assessed. Although all tools had at least some positive results reported in the included studies, tools with less supporting evidence may have been less likely to become available to end users. Finally, this project focused on pediatric pain tools; findings may differ for eHealth tools targeting adult pain or other health conditions.
Future research should extend beyond barriers and facilitators of eHealth tool availability by exploring the perspectives of other stakeholders involved in eHealth tool design, evaluation, and dissemination. End users, industry partners, and policy decision makers have important perspectives that should be explored.109 Currently, little is known about the demand for pediatric pain–related eHealth tools. Preferences for eHealth tools have been documented in some samples,109,122 and the prevalence of children's pain63,93,111 and access to technology26 suggest that eHealth tools may be useful in this population; however, study of the needs of target end users is required to better understand whether eHealth tools are appropriate for various pediatric pain populations. Similarly, research on the best methods of making tools available to pediatric populations (eg, standalone eHealth tools provided directly to end users vs eHealth tools provided as part of larger eHealth systems implemented within health care systems) is also needed. Future research should prospectively examine specific predictors of tool availability, such as the use of user-centered design processes and implementation research methods, to ensure that eHealth tools actually benefit users and do not contribute to potential research waste.
The authors have no conflict of interest to declare.
K.S. Higgins is supported by a CIHR Doctoral Research Award, a Maritime Strategy for Patient-Oriented Research (SPOR) Support Unit Student Award, and a Nova Scotia Health Research Foundation Scotia Support Grant awarded to C.T. Chambers. P.R. Tutelman is supported by a CIHR Vanier Canada Graduate Scholarship. C.T. Chambers is supported by a Tier 1 Canada Research Chair and is the senior author on this article. C.T. Chambers' research is also supported by the Canadian Institutes of Health Research (#139704). H.O. Witteman is supported by a Fonds de recherche du Québec—Santé Research Scholar Junior 1 career development award. The infrastructure for this study was provided by a Canada Foundation for Innovation grant to C.T. Chambers. Open access publication was funded by a Maritime SPOR Support Unit Open Access Publication Bursary awarded to K.S. Higgins.
The authors thank Alyssa Dickinson, Kristen Johnson, and Robin Parker for their assistance with this project.
Appendix A. Supplemental digital content
Supplemental digital content associated with this article can be found online at https://links.lww.com/PR9/A31.
. Ahola Kohut S, Stinson JN, Ruskin D, Forgeron P, Harris L, van Wyk M, Luca S, Campbell F. iPeer2Peer program: a pilot feasibility study in adolescents with chronic pain. PAIN 2016;157:1146–55.
. Alfven G. SMS pain diary: a method for real-time data capture of recurrent pain in childhood. Acta Paediatr 2010;99:1047–53.
. Altsitsiadis E, Toptsidou M, Bougiouklis K. H2M survey questionnaire for the identification of commercialisation training needs of Health Researchers. 2015. http://doi.org/10.5281/zenodo.31272. Accessed February 28, 2017.
. Anand V, Spalding SJ. Leveraging electronic tablets and a readily available data capture platform to assess chronic pain in children: the PROBE system. Stud Health Technol Inform 2015;216:554–8.
. Armbrust W, Bos JJFJ, Cappon J, van Rossum MAJJ, Sauer PJJ, Wulffraat N, van Wijnen VK, Lelieveld OTHM. Design and acceptance of Rheumates@Work, a combined internet-based and in person instruction model, an interactive, educational, and cognitive behavioral program for children with juvenile idiopathic arthritis. Pediatr Rheumatol Online J 2015;13:1–13.
. Baggott C, Gibson F, Coll B, Kletter R, Zeltzer P, Miaskowski C. Initial evaluation of an electronic symptom diary for adolescents with cancer. JMIR Res Protoc 2012;1:e23.
. Bakshi N, Smith M, Ross D, Krishnamurti L. Novel metrics in the longitudinal evaluation of pain data in sickle cell disease. Clin J Pain 2017;33:517–527.
. Bakshi N, Stinson JN, Ross D, Lukombo I, Mittal N, Joshi SV, Belfer I, Krishnamurti L. Development, content validity, and user review of a web-based multidimensional pain diary for adolescent and young adults with sickle cell disease. Clin J Pain 2015;31:580–90.
. Barwick M. Building scientist capacity in knowledge translation: development of the knowledge translation planning template. Technol Innov Manag Rev 2016;6:9–15.
. Barwick M, Lockett DM, Buckley L, Goering P. Scientist knowledge translation training manual. Toronto: The Hospital for Sick Children/Centre for Addiction and Mental Health, 2005.
. Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. An introduction to implementation science for the non-specialist. BMC Psychol 2015;3:32.
. Beck JS. Cognitive behavior therapy: basics and beyond. 2nd ed. New York: The Guilford Press, 2011.
. Bernet AC, Willens DE, Bauer MS. Effectiveness-implementation hybrid designs: implications for quality improvement science. Implement Sci 2013;8:S2.
. Bonnert M, Ljótsson B, Hedman E, Andersson J, Arnell H, Benninga MA, Simrén M, Thulin H, Thulin U, Vigerland S, Serlachius E, Olén O. Internet-delivered cognitive behavior therapy for adolescents with functional gastrointestinal disorders—an open trial. Internet Interv 2014;1:141–8.
. Bonnert M, Olen O, Lalouni M, Benninga MA, Bottai M, Engelbrektsson J, Hedman E, Lenhard F, Melin B, Simren M, Vigerland S, Serlachius E, Ljótsson B. Internet-delivered cognitive behavior therapy for adolescents with irritable bowel syndrome: a randomized controlled trial. Am J Gastroenterol 2017;112:152–62.
. Brandon TG, Becker BD, Bevans KB, Weiss PF. Patient-reported outcomes measurement information system tools for collecting patient-reported outcomes in children with juvenile arthritis. Arthritis Care Res (Hoboken) 2017;69:393–402.
. Brown NJ, Kimble RM, Rodger S, Ware RS, Cuttle L. Play and heal: randomized controlled trial of Ditto intervention efficacy on improving re-epithelialization in pediatric burns. Burns 2014;40:204–13.
. Butruille L, De Jonckheere J, Marcilly R, Boog C, Bras da Costa S, Rakza T, Storme L, Logier R. Development of a pain monitoring device focused on newborn infant applications: the NeoDoloris project. IRBM 2015;36:80–5. doi:10.1016/j.irbm.2015.01.005.
. Castarlenas E, Sanchez-Rodriguez E, de la Vega R, Roset R, Miro J. Agreement between verbal and electronic versions of the Numerical Rating Scale (NRS-11) when used to assess pain intensity in adolescents. Clin J Pain 2015;31:229–34.
. Caulfield T, Ogbogu U. The commercialization of university-based research: balancing risks and benefits. BMC Med Ethics 2015;16:1–7.
. Chalmers I, Glasziou P. Avoidable waste in the production and reporting of research evidence. Lancet 2009;374:86–9.
. Chan EA, Chung JWY, Wong TKS, Lien ASY, Yang JY. Application of a virtual reality prototype for pain relief of pediatric burn in Taiwan. J Clin Nurs 2007;16:786–93.
. Chen Y, Chin M, Greenberg S, Johnstone C, McGuinness J. Post-tonsillectomy pain in 24 children—utilising short message service (SMS) to assess postoperative outcomes. Clin Otolaryngol 2012;37:412–14.
. Cheng C, Brown C, New T, Stokes TH, Dampier C, Wang MD. SickleREMOTE: a two-way text messaging system for pediatric sickle cell disease patients. IEEE EMBS Int Conf Biomed Heal Inform 2012;2012:408–11.
. Cohen LL, Rodrigues NP, Lim CS, Bearden DJ, Welkom JS, Joffe NE, McGrath PJ, Cousins LA. Automated parent-training for preschooler immunization pain relief: a randomized controlled trial. J Pediatr Psychol 2015;40:526–34.
. Common Sense Media. Zero to eight: children's media use in America 2013. 2013. Available at: www.commonsense.org/research. Accessed June 29, 2018.
. Connelly M, Anthony KK, Sarniak R, Bromberg MH, Gil KM, Schanberg LE. Parent pain responses as predictors of daily activities and mood in children with juvenile idiopathic arthritis: the utility of electronic diaries. J Pain Symptom Manage 2010;39:579–90.
. Connelly M, Bromberg MH, Anthony KK, Gil KM, Franks L, Schanberg LE. Emotion regulation predicts pain and functioning in children with juvenile idiopathic arthritis: an electronic diary study. J Pediatr Psychol 2012;37:43–52.
. Connelly M, Miller T, Gerry G, Bickel J. Electronic momentary assessment of weather changes as a trigger of headaches in children. Headache 2010;50:779–89.
. Cooper SA, Desjardins PJ, Turk DC, Dworkin RH, Katz NP, Kehlet H, Ballantyne JC, Burke LB, Carragee E, Cowan P, Croll S, Dionne RA, Farrar JT, Gilron I, Gordon DB, Iyengar S, Jay GW, Kalso EA, Kerns RD, McDermott MP, Raja SN, Rappaport BA, Rauschkolb C, Royal MA, Segerdahl M, Stauffer JW, Todd KH, Vanhove GF, Wallace MS, West C, White RE, Wu C. Research design considerations for single-dose analgesic clinical trials in acute pain. PAIN 2016;157:288–301.
. Covidence systematic review software. Available at: www.covidence.org. Accessed May 1, 2017.
. Cravero JP, Fanciullo GJ, McHugo GJ, Baird JC. The validity of the Computer Face Scale for measuring pediatric pain and mood. Paediatr Anaesth 2013;23:156–61.
. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness- implementation hybrid design: combining elements of clinical effectiveness and implementation research to enhance public health. Med Care 2012;50:217–26.
. Dampier C, Barry V, Gross HE, Lui Y, Thornburg CD, Dewalt DA, Reeve BB. Initial evaluation of the pediatric PROMIS health domains in children and adolescents with sickle cell disease. Pediatr Blood Cancer 2016;63:1031–7.
. van Dijk A, McGrath PA, Pickett W, VanDenKerkhof EG. Pain prevalence in nine- to 13-year-old schoolchildren. Pain Res Manag 2006;11:234–40.
. Donovan E, Mehringer S, Zeltzer LK. Assessing the feasibility of a web-based self-management program for adolescents with migraines and their caregivers. Clin Pediatr (Phila) 2013;52:667–70.
. Eysenbach G. What is e-health? J Med Internet Res 2001;3:2–3.
. Fales J, Palermo TM, Law EF, Wilson AC. Sleep outcomes in youth with chronic pain participating in a randomized controlled trial of online cognitive-behavioral therapy for pain management. Behav Sleep Med 2015;13:107–23.
. Fanciullo GJ, Cravero JP, Mudge BO, McHugo GJ, Baird JC. Development of a new computer method to assess children's pain. Pain Med 2007;8:S121–8.
. Fisher E, Bromberg MH, Tai G, Palermo TM. Adolescent and parent treatment goals in an internet-delivered chronic pain self-management program: does agreement of treatment goals matter? J Pediatr Psychol 2016;42:657–66.
. Flink IK, Sfyrkou C, Persson B. Customized CBT via internet for adolescents with pain and emotional distress: a pilot study. Internet Interv 2016;4:43–50.
. Fortier MA, Chung WW, Martinez A, Gago-Masague S, Sender L. Pain buddy: a novel use of m-health in the management of children's cancer pain. Comput Biol Med 2016;76:202–14.
. van Gemert-Pijnen JEWC, Nijland N, van Limburg M, Ossebaard HC, Kelders SM, Eysenbach G, Seydel ER. A holistic framework to improve the uptake and impact of eHealth technologies. J Med Internet Res 2011;13:e111.
. Gholami B, Haddad WM, Tannenbaum AR. Agitation and pain assessment using digital imaging. Conf Proc IEEE Eng Med Biol Soc 2009;2009:2176–9.
. Gipson DS, Selewski DT, Massengill SF, Wickman L, Messer KL, Herreshoff E, Corinna B, Ferris ME, Mahan JD, Greenbaum LA, MacHardy J, Kapur G, Chand DH, Goebel J, Barletta GM, Geary D, Kershaw DB, Pan CG, Gbadegesin R, Hidalgo G, Lane JC, Leiser JD, Plattner BW, Song PX, Thissen D, Liu Y, Gross HD, DeWalt DA. Gaining the PROMIS perspective from children with nephrotic syndrome: a Midwest pediatric nephrology consortium study. Health Qual Life Outcomes 2013;11:30.
. Greenhalgh T, Wherton J, Papoutsi C, Lynch J, Hughes G, A'Court C, Hinder S, Fahy N, Procter R, Shaw S. Beyond adoption: a new framework for theorising and evaluating non-adoption, abandonment and challenges to scale-up, spread and sustainability (NASSS) of health and care technologies. J Med Internet Res 2017;19:e367.
. Gulur P, Rodi SW, Washington TA, Cravero JP, Fanciullo GJ, McHugo GJ, Baird JC. Computer Face Scale for measuring pediatric pain and mood. J Pain 2009;10:173–9.
. Gupta N, Naegeli AN, Turner-Bowker DM, Flood EM, Heath LE, Mays SM, Dampier C. Cognitive testing of an electronic version of the faces pain scale-revised with pediatric and adolescent sickle cell patients. Patient 2016;9:433–43.
. Haley SM, Ni P, Dumas HM, Fragala-Pinkham MA, Hambleton RK, Montpetit K, Bilodeau N, Gorton GE, Watson K, Tucker CA. Measuring global physical health in children with cerebral palsy: illustration of a multidimensional bi-factor model and computerized adaptive testing. Qual Life Res 2009;18:359–70.
. Harrison D, Reszel J, Dagg B, Aubertin C, Bueno M, Dunn S, Fuller A, Harrold J, Larocque C, Nicholls S, Sampson M. Pain management during newborn screening: using YouTube to disseminate effective pain management strategies. J Perinat Neonatal Nurs 2017;31:172–7.
. Heiderich TM, Leslie ATFS, Guinsburg R. Neonatal procedural pain can be assessed by computer software that has good sensitivity and specificity to detect facial movements. Acta Paediatr Int J Paediatr 2015;104:e63–9.
. Heyer GL, Perkins SQ, Rose SC, Aylward SC, Lee JM. Comparing patient and parent recall of 90-day and 30-day migraine disability using elements of the PedMIDAS and an Internet headache diary. Cephalalgia 2014;34:298–306.
. Heyer GL, Rose SC. Which factors affect daily compliance with an internet headache diary among youth with migraine? Clin J Pain 2015;31:1075–9.
. Hicks CL, von Baeyer CL, McGrath PJ. Online psychological treatment for pediatric recurrent pain: a randomized evaluation. J Pediatr Psychol 2006;31:724–36.
. Higgins JPT, Green S, editors. Cochrane Handbook for Systematic Reviews of Interventions. Version 5. The Cochrane Collaboration, 2011. Available at: www.cochrane-handbook.org. Accessed May 1, 2017.
. Hinds PS, Nuss SL, Ruccione KS, Withycombe JS, Jacobs S, Deluca H, Faulkner C, Liu Y, Cheng YI, Gross HE, Wang J, Dewalt DA. PROMIS pediatric measures in pediatric oncology: valid and clinically feasible indicators of patient-reported outcomes. Pediatr Blood Cancer 2013;60:402–8.
. Hourcade JP, Driessnack M, Huebner KE. Supporting face-to-face communication between clinicians and children with chronic headaches through a zoomable multi-touch app. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2012. pp. 2609–2618.
. Jacob E, Duran J, Stinson J, Lewis MA, Zeltzer L. Remote monitoring of pain and symptoms using wireless technology in children and adolescents with sickle cell disease. J Am Assoc Nurse Pract 2013;25:42–54.
. Jacob E, Pavlish C, Duran J, Stinson J, Lewis MA, Zeltzer L. Facilitating pediatric patient-provider communications using wireless technology in children and adolescents with sickle cell disease. J Pediatr Health Care 2013;27:284–92.
. Jacob E, Stinson J, Duran J, Gupta A, Gerla M, Lewis AM, Zeltzer L. Usability testing of a smartphone for accessing a web-based e-diary for self-monitoring of pain and symptoms in sickle cell disease. J Pediatr Hematol Oncol 2012;34:326–35.
. Jibb LA, Cafazzo JA, Nathan PC, Seto E, Stevens BJ, Nguyen C, Stinson JN. Development of a mHealth real-time pain self-management app for adolescents with cancer: an iterative usability testing study. J Pediatr Oncol Nurs 2017;34:283–94.
. Jibb LA, Stevens BJ, Nathan PC, Seto E, Cafazzo JA, Johnston DL, Hum V, Stinson JN. Implementation and preliminary effectiveness of a real-time pain management smartphone app for adolescents with cancer: a multicenter pilot clinical study. Pediatr Blood Cancer 2017;64:e26554. doi:10.1002/pbc.26554.
. King S, Chambers CT, Huguet A, MacNevin RC, McGrath PJ, Parker L, MacDonald AJ. The epidemiology of chronic pain in children and adolescents revisited: a systematic review. PAIN 2011;152:2729–38.
. Krogh AB, Larsson B, Salvesen O, Linde M. A comparison between prospective internet-based and paper diary recordings of headache among adolescents in the general population. Cephalalgia 2015;36:335–45.
. Kroon Van Diest AM, Ramsey R, Aylward B, Kroner JW, Sullivan SM, Nause K, Allen JR, Chamberlin LA, Slater S, Hommel K, Lecates SL, Kabbouche MA, O'Brien HL, Kacperski J, Hershey AD, Powers SW. Adherence to biobehavioral recommendations in pediatric migraine as measured by electronic monitoring: the Adherence in Migraine (AIM) study. Headache 2016;56:1137–46.
. de la Vega R, Miro J. mHealth: a strategic field without a solid scientific soul. A systematic review of pain-related apps. PLoS One 2014;9:e101312.
. de la Vega R, Roset R, Castarlenas E, Sanchez-Rodriguez E, Sole E, Miro J. Development and testing of painometer: a smartphone app to assess pain intensity. J Pain 2014;15:1001–7.
. Lalloo C, Jibb LA, Rivera J, Agarwal A, Stinson JN. There's a pain app for that. Clin J Pain 2015;31:557–63.
. Lalloo C, Stinson JN, Brown SC, Campbell F, Isaac L, Henry JL. Pain-QuILT: assessing clinical feasibility of a web-based tool for the visual self-report of pain in an interdisciplinary pediatric chronic pain clinic. Clin J Pain 2014;30:934–43.
. Lam A. What motivates academic scientists to engage in research commercialization: “Gold,” “ribbon” or “puzzle”? Res Pol 2011;40:1354–68.
. Law EF, Beals-Erickson SE, Noel M, Claar R, Palermo TM. Pilot randomized controlled trial of internet-delivered cognitive-behavioral treatment for pediatric headache. Headache 2015;55:1410–25.
. Law EF, Murphy LK, Palermo TM. Evaluating treatment participation in an internet-based behavioral intervention for pediatric chronic pain. J Pediatr Psychol 2012;37:893–903.
. Leclercq E, Leeflang MMG, van Dalen EC, Kremer LCM. Validation of search filters for identifying pediatric studies in PubMed. J Pediatr 2013;162:629–34.e2.
. Lelieveld OT, Armbrust W, Geertzen JH, de Graaf I, van Leeuwen MA, Sauer PJ, van Weert E, Bouma J. Promoting physical activity in children with juvenile idiopathic arthritis through an internet-based program: results of a pilot randomized controlled trial. Arthritis Care Res (Hoboken) 2010;62:697–703.
. Lewandowski AS, Palermo TM, Kirchner HL, Drotar D. Comparing diary and retrospective reports of pain and activity restriction in children and adolescents with chronic pain conditions. Clin J Pain 2009;25:299–306.
. van Limburg M, van Gemert-Pijnen JEWC, Nijland N, Ossebaard HC, Hendrix RMG, Seydel ER. Why business modeling is crucial in the development of eHealth technologies. J Med Internet Res 2011;13.
. Locke W, Lynch S, Girard B. University research activity, private sector collaboration and the commercialization of research in an academic environment: Memorial University of Newfoundland as a case study. 2002. http://www.csiic.ca/PDF/Lock_report.pdf. Accessed March 22, 2017.
. Long AC, Palermo TM. Brief report: web-based management of adolescent chronic pain: development and usability testing of an online family cognitive behavioral therapy program. J Pediatr Psychol 2009;34:511–16.
. McClellan CB, Schatz JC, Puffer E, Sanchez CE, Stancil MT, Roberts CW. Use of handheld wireless technology for a home-based sickle cell pain management protocol. J Pediatr Psychol 2009;34:564–73.
. McCormick M, Reed-Knight B, Lewis JD, Gold BD, Blount RL. Coping skills for reducing pain and somatic symptoms in adolescents with IBD. Inflamm Bowel Dis 2010;16:2148–57.
. McGuire B, Henderson E, McGrath P. Translating e-pain research into patient care. PAIN 2017;158:190–3.
. Melzack R, Wall PD. Pain mechanisms: a new theory. Science 1965;150:971–9.
. Miller K, Rodger S, Bucolo S, Greer R, Kimble RM. Multi-modal distraction. Using technology to combat pain in young children with burn injuries. Burns 2010;36:647–58.
. Miller K, Rodger S, Kipping B, Kimble RM. A novel technology approach to pain management in children with burns: a prospective randomized controlled trial. Burns 2011;37:395–405.
. Miller K, Tan X, Hobson AD, Khan A, Ziviani J, O'Brien E, Barua K, McBride CA, Kimble RM. A prospective randomized controlled trial of nonpharmacological pain management during intravenous cannulation in a pediatric emergency department. Pediatr Emerg Care 2016;32:444–51.
. Minogue V, Cooke M, Donskoy AL, Vicary P, Wells B. Patient and public involvement in reducing health and care research waste. Res Involv Engagem 2018;4:1–8.
. Moher D, Liberati A, Tetzlaff J, Altman DG. Reprint–preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Phys Ther 2009;89:873–80.
. Mott J, Bucolo S, Cuttle L, Mill J, Hilder M, Miller K, Kimble RM. The efficacy of an augmented virtual reality system to alleviate pain in children undergoing burns dressing changes: a randomised controlled trial. Burns 2008;34:803–8.
. Mulcahey MJ, Haley SM, Slavin MD, Kisala PA, Ni P, Tulsky DS, Jette AM. Ability of PROMIS pediatric measures to detect change in children with cerebral palsy undergoing musculoskeletal surgery. J Pediatr Orthop 2016;36:749–56.
. Murray E, Burns J, See Tai S, Lai R, Nazareth I. Interactive health communication applications for people with chronic disease. Cochrane Database Syst Rev 2005;4:1–82.
. Nasser M, Clarke M, Chalmers I, Brurberg KG, Nykvist H, Lund H, Glasziou P. What are funders doing to minimise waste in research? Lancet 2017;389:1006–7.
. Nieto R, Hernandez E, Boixados M, Huguet A, Beneitez I, McGrath P. Testing the feasibility of DARWeb: an online intervention for children with functional abdominal pain and their parents. Clin J Pain 2015;31:493–503.
. Noel M, Chambers CT, Parker JA, Aubrey K, Tutelman PR, Morrongiello B, Moore C, McGrath PJ, Yanchar NL, Von Baeyer CL. Boo-boos as the building blocks of pain expression: an observational examination of parental responses to everyday pain in toddlers. Can J Pain 2018;2:74–86.
. O'Conner-Von S. Preparation of adolescents for outpatient surgery: using an internet program. AORN J 2008;87:374–82.
. Palermo TM, Law EF, Fales J, Bromberg MH, Jessen-Fiddick T, Tai G. Internet-delivered cognitive-behavioral treatment for adolescents with chronic pain and their parents. PAIN 2016;157:174–85.
. Palermo TM, Law EF, Zhou C, Holley AL, Logan D, Tai G. Trajectories of change during a randomized controlled trial of internet-delivered psychological treatment for adolescent chronic pain: how does change in pain and function relate? PAIN 2015;156:626–34.
. Palermo TM, Wilson AC, Peters M, Lewandowski A, Somhegyi H. Randomized controlled trial of an internet-delivered family cognitive-behavioral therapy intervention for children and adolescents with chronic pain. PAIN 2009;146:205–13.
. Parikh SV, Huniewicz P. E-health: an overview of the uses of the Internet, social media, apps, and websites for mood disorders. Curr Opin Psychiatry 2015;28:13–17.
. Pender NJ, Walker SN, Sechrist KR, Stromborg MF. Development and testing of the health promotion model. Cardiovasc Nurs 1988;24:41–3.
. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health 2011;38:65–76.
. Robins S. The best solution is a human solution. 2014. Available at: https://suerobins.com/2014/01/18/the-best-solution-is-a-human-solution/. Accessed December 21, 2017.
. Rogers MAM, Lemmen K, Kramer R, Mann J, Chopra V. Internet-delivered health interventions that work: systematic review of meta-analyses and evaluation of website availability. J Med Internet Res 2017;19:1–28.
. Ruland CM, Starren J, Vatne TM. Participatory design with children in the development of a support system for patient-centered care in pediatric oncology. J Biomed Inform 2008;41:624–35.
. Sanchez-Rodriguez E, de la Vega R, Castarlenas E, Roset R, Miro J. An app for the assessment of pain intensity: validity properties and agreement of pain reports when used with young people. Pain Med 2015;16:1982–92.
. Schatz J, Schlenz AM, McClellan CB, Puffer ES, Hardy S, Pfeiffer M, Roberts CW. Changes in coping, pain, and activity after cognitive-behavioral training: a randomized clinical trial for pediatric sickle cell disease using smartphones. Clin J Pain 2015;31:536–47.
. Selewski DT, Collier DN, MacHardy J, Gross HE, Pickens EM, Cooper AW, Bullock S, Earls MF, Pratt KJ, Scanlon K, McNeill JD, Messer KL, Lu Y, Thissen D, DeWalt DA, Gipson DS. Promising insights into the health related quality of life for children with severe obesity. Health Qual Life Outcomes 2013;11:29.
. Sikka K, Ahmed AA, Diaz D, Goodwin MS, Craig KD, Bartlett MS, Huang JS. Automated assessment of children's postoperative pain using computer vision. Pediatrics 2015;136:e124–31.
. Singal AG, Higgins PDR, Waljee AK. A primer on effectiveness and efficacy trials. Clin Transl Gastroenterol 2014;5:e45–4.
. Slater H, Campbell JM, Stinson JN, Burley MM, Briggs AM. End user and implementer experiences of mhealth technologies for noncommunicable chronic disease management in young adults: systematic review. J Med Internet Res 2017;19:e406.
. Slater H, Jordan JE, Chua J, Schütze R, Wark JD, Briggs AM. Young people's experiences of persistent musculoskeletal pain, needs, gaps and perceptions about the role of digital technologies to support their co-care: a qualitative study. BMJ Open 2016;6:e014007.
. Stevens BJ, Abbott LK, Yamada J, Harrison D, Stinson J, Taddio A, Barwick M, Latimer M, Scott SD, Rashotte J, Campbell F, Finley GA. Epidemiology and management of painful procedures in children in Canadian hospitals. CMAJ 2011;183:E403–10.
112. Stinson J, Ahola Kohut S, Forgeron P, Amaria K, Bell M, Kaufman M, Luca N, Luca S, Harris L, Victor C, Spiegel L. The iPeer2Peer program: a pilot randomized controlled trial in adolescents with juvenile idiopathic arthritis. Pediatr Rheumatol Online J 2016;14:48.
113. Stinson J, McGrath P, Hodnett E, Feldman B, Duffy C, Huber A, Tucker L, Hetherington R, Tse S, Spiegel L, Campillo S, Gill N, White M. Usability testing of an online self-management program for adolescents with juvenile idiopathic arthritis. J Med Internet Res 2010;12:e30.
114. Stinson J, Wilson R, Gill N, Yamada J, Holt J. A systematic review of internet-based self-management interventions for youth with health conditions. J Pediatr Psychol 2009;34:495–510.
115. Stinson JN, Jibb LA, Lalloo C, Feldman BM, McGrath PJ, Petroz GC, Streiner D, Dupuis A, Gill N, Stevens BJ. Comparison of average weekly pain using recalled paper and momentary assessment electronic diary reports in children with arthritis. Clin J Pain 2014;30:1044–50.
116. Stinson JN, Jibb LA, Nguyen C, Nathan PC, Maloney AM, Dupuis LL, Gerstle JT, Alman B, Hopyan S, Strahlendorf C, Portwine C, Johnston DL, Orr M. Development and testing of a multidimensional iPhone pain assessment application for adolescents with cancer. J Med Internet Res 2013;15:e51.
117. Stinson JN, Jibb LA, Nguyen C, Nathan PC, Maloney AM, Dupuis LL, Gerstle JT, Hopyan S, Alman BA, Strahlendorf C, Portwine C, Johnston DL. Construct validity and reliability of a real-time multidimensional smartphone app to assess pain in children and adolescents with cancer. PAIN 2015;156:2607–15.
118. Stinson JN, Lalloo C, Harris L, Isaac L, Campbell F, Brown S, Ruskin D, Gordon A, Galonski M, Pink LR, Buckley N, Henry JL, White M, Karim A. iCanCope with Pain: user-centred design of a web- and mobile-based self-management program for youth with chronic pain based on identified health care needs. Pain Res Manag 2014;19:257–66.
119. Stinson JN, McGrath PJ, Hodnett ED, Feldman BM, Duffy CM, Huber AM, Tucker LB, Hetherington CR, Tse SML, Spiegel LR, Campillo S, Gill NK, White ME. An internet-based self-management program with telephone support for adolescents with arthritis: a pilot randomized controlled trial. J Rheumatol 2010;37:1944–52.
120. Stinson JN, Petroz GC, Stevens BJ, Feldman BM, Streiner D, McGrath PJ, Gill N. Working out the kinks: testing the feasibility of an electronic pain diary for adolescents with arthritis. Pain Res Manag 2008;13:375–82.
121. Stinson JN, Stevens BJ, Feldman BM, Streiner D, McGrath PJ, Dupuis A, Gill N, Petroz GC. Construct validity of a multidimensional electronic pain diary for adolescents with arthritis. PAIN 2008;136:281–92.
122. Stinson JN, Toomey PC, Stevens BJ, Kagan S, Duffy CM, Huber A, Malleson P, McGrath PJ, Yeung RSM, Feldman BM. Asking the experts: exploring the self-management needs of adolescents with arthritis. Arthritis Rheum 2008;59:65–72.
123. Sun T, West N, Ansermino JM, Montgomery CJ, Myers D, Dunsmuir D, Lauder GR, von Baeyer C. A smartphone version of the Faces Pain Scale-Revised and the Color Analog Scale for postoperative pain assessment in children. Paediatr Anaesth 2015;25:1264–73.
124. Tornoe B, Skov L. Computer animated relaxation therapy in children between 7 and 13 years with tension-type headache: a pilot study. Appl Psychophysiol Biofeedback 2012;37:35–44.
125. Trautmann E, Kroner-Herwig B. A randomized controlled trial of internet-based self-help training for recurrent headache in childhood and adolescence. Behav Res Ther 2010;48:28–37.
126. Trautmann E, Kroner-Herwig B. Internet-based self-help training for children and adolescents with recurrent headache: a pilot study. Behav Cogn Psychother 2008;36:241–5.
127. Tsimicalis A, Stone PW, Bakken S, Yoon S, Sands S, Porter R, Ruland C. Usability testing of a computerized communication tool in a diverse urban pediatric population. Cancer Nurs 2014;37:E25–34.
128. Vaisson G, Renaud J-S, Dugas M, Provencher T, Breton E, Chipenda Dansokho S, Colquhoun H, Fagerlin A, Giguere A, Glouberman S, Haslett L, Hoffman AS, Ivers N, Legare F, Legare J, Levin C, Lopez K, Montori VM, Sparling K, Stacey D, Trottier M-E, Volk RJ, Witteman HO. User involvement in the development of patient decision aids and a validated measure of user-centredness. Med Decis Making 2018;38:e77–78.
129. Vanderford NL, Weiss LT, Weiss HL. A survey of the barriers associated with academic-based cancer research commercialization. PLoS One 2013;8:e72268.
130. Varni JW, Magnus B, Stucky BD, Liu Y, Quinn H, Thissen D, Gross HE, Huang IC, DeWalt DA. Psychometric properties of the PROMIS pediatric scales: precision, stability, and comparison of different scoring and administration options. Qual Life Res 2014;23:1233–43.
131. Varni JW, Stucky BD, Thissen D, Dewitt EM, Irwin DE, Lai JS, Yeatts K, Dewalt DA. PROMIS pediatric pain interference scale: an item response theory analysis of the pediatric pain item bank. J Pain 2010;11:1109–19.
132. Voerman JS, Remerie S, Westendorp T, Timman R, Busschbach JJV, Passchier J, de Klerk C. Effects of a guided internet-delivered self-help intervention for adolescents with chronic pain. J Pain 2015;16:1115–26.
133. Web Accessibility Initiative. Web content accessibility guidelines (WCAG) 2.0. 2008. Available at: https://www.w3.org/TR/WCAG20/
134. White M, Stinson JN, Lingley-Pottie P, McGrath PJ, Gill N, Vijenthira A. Exploring therapeutic alliance with an internet-based self-management program with brief telephone support for youth with arthritis: a pilot study. Telemed J E Health 2012;18:271–6.
135. World Health Organization. Building foundations for eHealth: progress of member states: report of the WHO global observatory for eHealth. Geneva: World Health Organization, 2006.
136. Witteman HO, Dansokho SC, Colquhoun H, Coulter A, Dugas M, Fagerlin A, Giguere AMC, Glouberman S, Haslett L, Hoffman A, Ivers N, Légaré F, Légaré J, Levin C, Lopez K, Montori VM, Provencher T, Renaud J, Sparling K, Stacey D, Vaisson G, Volk RJ, Witteman W. User-centered design and the development of patient decision aids: protocol for a systematic review. Syst Rev 2015;4:1–8.
137. Wolpin S, Stewart M. A deliberate and rigorous approach to development of patient-centered technologies. Semin Oncol Nurs 2011;27:183–91.
138. Wood C, von Baeyer C, Falinower S, Moyse D, Annequin D, Legout V. Electronic and paper versions of a faces pain intensity scale: concordance and preference in hospitalized children. BMC Pediatr 2011;11:1–9.
139. Yeh ML, Hung YL, Chen HH, Lin JG, Wang YJ. Auricular acupressure combined with an internet-based intervention or alone for primary dysmenorrhea: a control study. Evid Based Complement Alternat Med 2013;2013:316212.