
Research Paper

Construct validity and reliability of a real-time multidimensional smartphone app to assess pain in children and adolescents with cancer

Stinson, Jennifer N.; Jibb, Lindsay A.; Nguyen, Cynthia; Nathan, Paul C.; Maloney, Anne Marie; Dupuis, L. Lee; Gerstle, J. Ted; Hopyan, Sevan; Alman, Benjamin A.; Strahlendorf, Caron; Portwine, Carol; Johnston, Donna L.

doi: 10.1097/j.pain.0000000000000385

1. Introduction

Most children and adolescents with cancer experience acute pain, caused by the disease (eg, tumor-related pain), invasive procedures, or treatment toxicities (eg, mucositis, infection), and longer-lasting chronic pain.3,4,20,62 Pain is common across all types of childhood cancers and negatively impacts health-related quality of life (HRQL).4,5,14,44,57 Despite available and effective pain treatments, cancer pain remains undermanaged.1,18 Although there are multiple obstacles to effective pain treatment, inadequate assessment and patient reluctance to report pain are the most commonly reported barriers.1,12,15 Therefore, if pediatric cancer pain treatment is to be improved, it is necessary to obtain data on its prevalence, severity, and impact on HRQL.22,66

Pain is a complex, multidimensional, subjective phenomenon; consequently, self-reported assessment is recommended for school-age children and adolescents with appropriate verbal and cognitive capacity.51 The pain dimensions commonly measured in these children are (1) sensory (ie, intensity, quality, and location of pain), (2) affective (ie, emotional effects or unpleasantness, measurable in magnitude), and (3) evaluative (ie, interference with functioning, including activities of daily living and social interactions).34

Although measures for assessing patient-reported outcomes, including pain, in children with cancer exist,13,38 they: (1) rely heavily on recall (eg, pain over last 2 weeks); (2) do not allow for prospective longitudinal assessment in naturalistic environments (eg, hospital, home, school); and (3) fail to assess the multidimensional nature of pain.48,51 Real-time data capture approaches, where patients report their status in the moment or over short periods of time (eg, hours to the end of day) using pain diaries, minimize recall bias by enabling the collection of real-time momentary data.17,53,55 Electronic pain diaries implemented using devices such as smartphones have been found to maximize adherence with reporting35,40,50,56 and improve the validity of reports, compared with paper-based approaches.49,53 They also allow for the examination of pain variability over time and within-person associations (eg, pain presence and intensity being associated with low mood32,59), and higher resolution of pain treatment response by collecting multiple assessments over time.7,30,65 For these reasons, electronic pain diaries using methods for real-time data capture have been proposed as a new standard for measurement of pain.46,53,58

Children as young as 8 years are able to use electronic diaries to provide reliable and valid data24,37,40,53; yet, no diary has been developed to capture the pain experience of pediatric cancer patients. We have previously demonstrated the usability (ie, easy to use, understandable) and feasibility (ie, high adherence and acceptability, few technical difficulties) of a real-time multidimensional cancer pain assessment diary called Pain Squad, which is available as a smartphone application (app).50 The overall goal of this research was to evaluate the psychometric properties of this app. The aim of study 1 was to determine construct validity (convergent and discriminant) and reliability (internal consistency). We also explored the experience of pediatric cancer pain, including how it changed within days. The aim of study 2 was to determine responsiveness (sensitivity to change) in participants undergoing tumor resection surgery. In both studies, the feasibility (adherence and technical difficulties) and acceptability of the app were assessed.

2. Methods

2.1. Study 1

2.1.1. Design, setting, and participants

A prospective descriptive study design with repeated measures was used to test the construct validity, reliability, and feasibility of the multidimensional Pain Squad smartphone app. The study was undertaken between September 2012 and October 2014 at 4 university-affiliated pediatric tertiary care cancer centers in Canada. A convenience sampling method was used. Inclusion criteria were children and adolescents who were (1) between the ages of 8 and 18 years; (2) diagnosed with cancer; (3) undergoing treatment (chemotherapy and/or radiation); (4) self-reporters of pain (at least 1 episode within the past week); and (5) able to speak and read English. Exclusion criteria were (1) cognitive impairments; (2) major comorbid illnesses (eg, medical or psychiatric) that might interfere with their ability to complete, or might influence, their self-reported daily pain ratings on Pain Squad; (3) receiving end-of-life care; or (4) participation in study 2 of this research project.

2.1.2. Pain Squad real-time multidimensional pain app

The Pain Squad app collects data on pain intensity, interference with a child or adolescent's life (ie, relationships, school work, sleep, mood), and unpleasantness using electronic visual analogue scales (VASs; 5 cm in length) scored on a 0 to 10 scale. It also evaluates pain duration, pain location, and medications and other physical and psychological pain management strategies used. Example assessment items include “Touch the mark and move it to show how much pain you have right now” and “When you had pain in the last 12 hours, how long did it usually last?” Supplement 1 shows all Pain Squad app assessment items (available online as Supplemental Digital Content). Users are prompted twice daily at configurable times (morning and evening) to complete 20 questions characterizing their pain. The app transmits these data to a database for aggregate reporting through an Internet-based interface. To maximize adherence, the process of completing pain assessments has been “gamified,” meaning game-like elements have been added to the app to engage and motivate users.27 The gamification process involves (1) adolescents playing the role of law-enforcement officers on the “Pain Squad” and (2) an adherence-based rewards system (ie, the ability to advance through the law-enforcement ranks and receive videotaped acknowledgements from actors playing other officers for logging pain assessments). The app was designed to obtain twice daily pain ratings (on waking and before bed) using a signal-contingent approach: audible alarms sounded at fixed times, and each pain assessment had to be completed within a 30-minute window or the survey was considered missed.
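The signal-contingent, 30-minute-window rule described above can be expressed in a few lines. The sketch below is illustrative only (in Python, not the app's actual iOS implementation; function and variable names are invented):

```python
from datetime import datetime, timedelta
from typing import Optional

# A prompted assessment counts only if completed within 30 minutes
# of the alarm; otherwise it is recorded as missed (per the protocol
# described above). Illustrative sketch, not the authors' code.
RESPONSE_WINDOW = timedelta(minutes=30)

def assessment_status(prompt_time: datetime,
                      completed_at: Optional[datetime]) -> str:
    """Classify one prompted assessment as 'completed' or 'missed'."""
    if completed_at is None:
        return "missed"
    if timedelta(0) <= completed_at - prompt_time <= RESPONSE_WINDOW:
        return "completed"
    return "missed"

morning_alarm = datetime(2014, 3, 1, 8, 0)
print(assessment_status(morning_alarm, datetime(2014, 3, 1, 8, 20)))  # completed
print(assessment_status(morning_alarm, datetime(2014, 3, 1, 9, 5)))   # missed
print(assessment_status(morning_alarm, None))                         # missed
```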

2.1.3. Procedures

The study was approved by the Institutional Review Boards at the participating study sites. Site research assistants abstracted relevant socio-demographic and disease-related data from each participant's medical chart, and participants completed baseline measures online. All participants were provided with an Apple iPhone 4 to collect electronic data during the study and were given a 10-minute demonstration of the Pain Squad app functionalities before study commencement. Participants were asked to complete app entries twice daily for a 2-week period. At the end of the first week, participants were asked (1) to record their current pain intensity, unpleasantness, and interference, as well as recalled least, average, and worst ratings for these pain indices over the preceding week, using the Recalled Pain Inventory (RPI)53 and (2) to use the Pain Squad app for the remainder of the study period. On the final day of the study, participants completed multiple measures, including the Pediatric Quality of Life Inventory (PedsQL) 4.0 (Generic Scale),63 the PedsQL Cancer Module,63 and the Pain Coping Questionnaire Short-form,42 to determine the construct validity of the Pain Squad app. These measures all have evidence of validity in pediatric patients.23,63 All measures, including the RPI completed at the end of the first week, were printed on paper, prepackaged, and given to participants with a set of instructions at the initial meeting. A telephone call was made to each participant after week 1 and week 2 to remind them to complete the measures. The investigator-developed Pain Squad Evaluation Questionnaire, which examined the acceptability of the app, was administered electronically the morning after completion of the last app entry. This questionnaire uses a 4-point scale to categorically rate the app on 10 criteria, including attractiveness, time required to complete, and perceived interference with other activities. The questionnaire also allows participants to provide free-text feedback on the app and study. Each assessment required approximately 5 to 10 minutes for a participant to complete.

To test the convergent validity of the Pain Squad app, we examined correlations between weekly pain indices' scores on the Pain Squad app (averaged from all real-time VAS pain reports completed) and scores from recalled average weekly pain indices rated on the RPI and hypothesized that they would be positive in direction and moderate to high in magnitude (0.5-0.75) based on previous research.25,53 Although we argue that recalled pain ratings suffer recall bias and inaccuracies, we chose to compare the real-time VAS reports to the RPI as recalled pain assessment methods are used most commonly in clinical research.25,49,51 To test the discriminant validity of the Pain Squad app, we examined correlations between weekly pain indices' scores on the Pain Squad app (averaged from all real-time VAS pain reports completed) and overall and disease-specific HRQL using the PedsQL instruments and pain coping and hypothesized that they would be low in magnitude as demonstrated in previous research.43,45,53 To test the Pain Squad app's internal consistency, we examined the correlations between average weekly pain indices' VAS scores for weeks 1 and 2 and hypothesized that they would demonstrate a Cronbach alpha of at least 0.7.
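As a concrete illustration of the internal consistency statistic hypothesized above, the following is a minimal Python sketch of Cronbach's alpha (the study's analyses were run in SAS, so this code and its item scores are purely illustrative):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a set of scale items.

    items: a list of equal-length lists, one per item (eg, one list
    of weekly VAS scores per pain index). Illustrative only; not the
    authors' code.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Per-respondent total score across all items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))

# Three perfectly consistent (identical) items yield alpha ~ 1.0;
# the study's threshold for acceptable reliability was 0.7.
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]))
```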

2.1.4. Statistical analyses

Sample size analysis showed that 75 participants would be required for the lowest expected correlation coefficient of r = 0.50, with α = 0.05 and power = 80%. Furthermore, a sample size of 75 individuals would ensure a confidence interval of ±0.20 around a Cronbach alpha estimate of 0.70 (for calculation of internal consistency reliability).6 Given the clinical symptoms suffered by children and adolescents with cancer that might lead to study dropout (eg, fatigue, nausea), our recruitment goal was increased to 90 participants.

Data from the Pain Squad app, demographic data, and outcome measures were analyzed using the SAS version 9.4 statistical analysis software package. An alpha level of P < 0.01 was used to account for the multiple comparisons. Because our data set was large, the central limit theorem ensured that the sampling distributions of mean values were normally distributed and transformations or nonparametric analyses were not used. Pearson correlation coefficients were computed among average weekly pain indices' ratings on the Pain Squad app (weeks 1 and 2) and scores from the other measures to determine app validity and reliability.
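The correlational analyses described above use the standard Pearson product-moment formula. A self-contained Python sketch follows (the study itself used SAS 9.4, so this re-implementation and its sample data are purely illustrative):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation for paired samples, eg,
    averaged weekly Pain Squad VAS ratings vs the corresponding
    recalled (RPI) ratings. Illustrative only; not the authors' code."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Perfectly linear paired ratings give r = 1.0 (or -1.0 if inverse).
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))   # 1.0
print(pearson_r([1, 2, 3, 4], [8, 6, 4, 2]))   # -1.0
```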

Repeated-measures analysis of variance testing was used to evaluate whether the mean values on average weekly Pain Squad pain indices' ratings (dependent variables) varied across the time of day (Time Effect: morning and evening) and week (Week Effect: weeks 1 and 2). For these analyses, the basic assumptions underlying the test were also examined (ie, normality and homogeneity of variance). Although the dependent variables were positively skewed, square root transformation did not improve the distribution of the residuals; therefore, analyses were conducted using the raw data.

Adherence was calculated by comparing the observed and expected numbers of app entries completed overall for the sample. We defined 100% adherence as 28 entries being completed by participants over 2 weeks (2 entries per day for 14 days). Random-effects logistic modeling was used to analyze the association between missing data and suspected covariates, such as gender, age, diagnosis, worst pain in the past 12 hours, and average pain in the past 12 hours. This method accounted for the assumption that the multiple data points collected from each participant would be correlated. Significance for this model was set at P < 0.05. Data from the Pain Squad Evaluation Questionnaire were used to examine participants' likes and dislikes of the electronic pain app. Quantitative data from this questionnaire were analyzed descriptively (ie, the number and percentage of participants selecting a categorical response to a given question were calculated). Three investigators (J.N.S., L.A.J., and C.N.) reviewed qualitative data from the free-text questions on the questionnaire and organized these data into themes reflecting participants' likes and dislikes regarding the app.
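The adherence definition above reduces to a simple observed-over-expected ratio, which can be sketched as follows (illustrative Python; function name invented):

```python
def adherence_pct(entries_completed, days=14, entries_per_day=2):
    """Percent adherence: observed app entries over expected entries.
    Study 1 expected 28 entries (2/day x 14 days); study 2, with its
    3-week protocol, expected 42. Illustrative sketch only."""
    expected = days * entries_per_day
    return 100.0 * entries_completed / expected

print(adherence_pct(28))            # 100.0 (full adherence, study 1)
print(adherence_pct(21, days=21))   # 50.0 (half of study 2's 42 entries)
```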

2.2. Study 2

2.2.1. Design, setting, and participants

A prospective descriptive study design with repeated measures was used to determine the responsiveness and feasibility of the Pain Squad app. The ability of a measure to detect the overall effect of an intervention of known efficacy is commonly referred to as sensitivity to change and is considered a facet of construct validity. Commonly used methods to determine the sensitivity of a measure include comparison of F scores and effect sizes.31 We hypothesized that the Pain Squad diary would be responsive to change in pain by demonstrating significantly higher average weekly pain indices' ratings between (1) week 1 (baseline, before surgery) and week 2 (first postoperative week), and between (2) week 2 (first postoperative week) and week 3 (second postoperative week) in participants undergoing tumor resection, with effect sizes that were at least moderate in magnitude (Cohen d ≥ 0.50).9

This study was undertaken between September 2012 and October 2014 in 3 of the settings outlined in study 1. A convenience sampling method was used. Inclusion criteria were children and adolescents who were (1) between the ages of 8 and 18 years, (2) diagnosed with cancer, (3) undergoing treatment (chemotherapy and/or radiation), (4) scheduled to undergo solid tumor resection surgery, and (5) able to speak and read English. Exclusion criteria were (1) cognitive impairments; (2) major comorbid illnesses (eg, medical or psychiatric) that might interfere with their ability to complete, or might influence, their self-reported daily pain ratings on Pain Squad; (3) receiving end-of-life care; or (4) participation in study 1 of this research project.

2.2.2. Pain Squad real-time multidimensional pain app

The Pain Squad app has been described above under the study 1 section.

2.2.3. Procedures

Similar study procedures were used as outlined for study 1 with a few differences. Participants were asked to start using the app 1 week (7 days) before their scheduled surgery. The investigator met with each participant before the scheduled procedure to collect data using the same measures used in study 1. The investigator then asked the participant to continue to complete daily pain assessments for the next 2 weeks after their surgery, and telephone calls were made as reminders to complete outcome measures as outlined above.

2.2.4. Statistical analyses

Based on previous research,9 we calculated that 25 participants would be required to achieve 80% power to detect an effect size of 0.45 when the F test was used to test the time factor at P < 0.01 and the actual standard deviation (SD) among the mean values was 0.29. To test responsiveness, the differences in pain indices' scores between weeks 1 and 2 and between weeks 2 and 3 were calculated. The dependent variables were positively skewed and transformations did not improve the distribution of the residuals; therefore, Wilcoxon signed-rank tests were used to compare values across weeks. The significance level was set at P < 0.05 for this analysis. For the within-person effect size analyses, effect size was calculated as the change in a participant's average weekly pain indices' ratings divided by the SD of the participant's baseline scores in week 1. Effect sizes were designated as negligible (0.00-0.19), small (0.20-0.49), medium (0.50-0.79), or large (≥0.80) in magnitude of change.9 In addition, median effect sizes were calculated given that the pain indices were not normally distributed.31 Similar to study 1, adherence was calculated by comparing the observed and expected numbers of app entries completed overall for the sample. We defined 100% adherence as 42 entries being completed by participants over 3 weeks (2 entries per day for 21 days). Data from the Pain Squad Evaluation Questionnaire were used to examine participants' likes and dislikes associated with the smartphone app according to the methods described in study 1.
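The within-person effect size computation and the magnitude benchmarks described above can be sketched as follows (an illustrative Python version of the stated formula, not the authors' SAS code; the sample numbers are invented):

```python
def within_person_effect_size(baseline_week, followup_week):
    """Change in a participant's average weekly rating divided by the
    SD of that participant's baseline (week 1) scores, per the formula
    described above. Illustrative sketch only."""
    n = len(baseline_week)
    mb = sum(baseline_week) / n
    mf = sum(followup_week) / len(followup_week)
    sd_b = (sum((x - mb) ** 2 for x in baseline_week) / (n - 1)) ** 0.5
    return (mf - mb) / sd_b

def magnitude(d):
    """Benchmark labels used in the study.9"""
    d = abs(d)
    if d < 0.20:
        return "negligible"
    if d < 0.50:
        return "small"
    if d < 0.80:
        return "medium"
    return "large"

# Hypothetical participant: week 1 (presurgery) vs week 2 (postsurgery).
d = within_person_effect_size([1, 2, 3], [4, 4, 4])
print(d, magnitude(d))  # 2.0 large
```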

3. Results

3.1. Study 1

3.1.1. Demographic and illness characteristics of the sample

The mean age of the 92 participants in study 1 was 13.1 years (SD = 2.9; range: 8-18 years), and the sample was even in terms of sex distribution (female: 48.9%; male: 51.1%). The illness characteristics of the sample are outlined in Table 1.

Table 1:
Characteristics of study participants.

3.1.2. Characteristics of cancer-related pain ratings

Participants reported mild current pain intensity (mean ± SD: 2.1 ± 2.9/10) over the course of the 2-week study period. During this period, 4 participants (4.3%) experienced no pain, while 12 (13.0%) reported pain on every entry. Examining pain interference with function, pain interfered most with how a participant felt (2.5 ± 3.4/10) and had the least impact on schoolwork (1.6 ± 3.1/10). On average, participants perceived themselves to have a moderate capacity to control experienced pain (5.9 ± 2.7/10). There was no difference between males and females with respect to current pain, but age significantly predicted increased current pain intensity and unpleasantness, with older participants (≥13 years) reporting higher pain and more unpleasantness than their younger counterparts (P < 0.01 for each). When only app entries where pain was reported as present (ie, pain > 0) were considered, pain intensity (4.2 ± 2.8/10), unpleasantness (5.6 ± 2.9/10), and interference (4.4 ± 2.7/10) were moderate on average. Mean “worst,” “least,” and “average” pain in the 12 hours preceding a pain report were 6.2 ± 2.6/10, 2.8 ± 2.7/10, and 4.3 ± 2.5/10, respectively.

3.1.3. Changes in pain indices within and across days

Pain fluctuated within and across days. The main effect of the time of day for pain interference and worst pain experienced, as well as the main effect of week for pain control were significant (P < 0.01 for all). Participants reported significantly higher levels of pain interference and worst pain experienced in the evening compared with the morning, and significantly more control over pain in week 1 compared with week 2. There was no change in pain from week 1 to week 2 in terms of pain indices with respect to real-time pain reports collected using the Pain Squad app. However, participants were asked at the end of the study to think about their pain over the last 7 days and to compare it with the week before in terms of whether they thought it had changed. While 21.4% of participants reported their pain as unchanged, 55.0% thought it was “a little” or “much better” and 23.7% perceived their pain to be “a little” or “much worse.”

3.1.4. Construct validity of the Pain Squad app

Correlations between pain ratings on the Pain Squad app and other theoretically relevant constructs

The mean values, SDs, and correlations for average weekly pain indices' ratings on the Pain Squad app, and the same indices from the RPI are outlined in Table 2. As predicted, all of the correlations were statistically significant (P < 0.0001) with the magnitude of all correlations being in the moderate to high range (0.43-0.68), demonstrating the convergent validity of the Pain Squad app. The mean values, SDs, and correlations for scores on PedsQL Generic Inventory, PedsQL Cancer Module, and the Pain Coping Questionnaire, and dimensions of the Pain Squad app are outlined in Table 3. As predicted, correlations between average weekly pain indices' VAS scores on the Pain Squad app and scores from generic HRQL (−0.20 to −0.46), disease-specific HRQL (−0.12 to −0.28), and pain coping (0.25-0.29) were low in magnitude, demonstrating the discriminant validity of the Pain Squad app.

Table 2:
Correlation between weekly pain intensity, unpleasantness, and interference ratings on the Pain Squad app and the RPI.
Table 3:
Correlation between weekly pain intensity, unpleasantness, and interference ratings on the Pain Squad app, and for scores on the PedsQL Generic Inventory, PedsQL Cancer Module, and Pain Coping Questionnaire.

3.1.5. Reliability of the Pain Squad app

The internal consistency reliability of the Pain Squad app was demonstrated as a standardized Cronbach α of 0.96 for both week 1 and week 2. There was a negligible impact of deleting key variables from the scale on Cronbach α (Table 4).

Table 4:
Cronbach coefficient α for the Pain Squad app.

3.2. Study 2

3.2.1. Demographic and illness characteristics of the sample

The mean age of the 14 participants in study 2 was 14.8 years (SD = 2.8; range: 9-18 years), and the sample was even in terms of sex distribution (female: 50.0%; male: 50.0%). The illness characteristics of the sample are outlined in Table 1.

3.2.2. Characteristics of cancer-related pain ratings

On average, participants reported mild levels of pain intensity (1.6 ± 2.3/10), interference (1.6 ± 2.3/10), and unpleasantness (1.9 ± 2.9/10) before surgery. Seven participants reported no pain during the week before surgery. Changes in pain intensity, unpleasantness, and interference scores across the 3-week study period are addressed in the sections below on the ability of the app to detect changes after surgery. Pain intensity and unpleasantness reports were on average higher in the week after surgery and then decreased in the subsequent week.

3.2.3. Ability of the Pain Squad app to detect changes after surgery using F scores

The effect of week was not significant for pain intensity (P = 0.06), and specifically the increase in pain intensity from week 1 to week 2 was not statistically significant (P = 0.07). However, the decrease in pain intensity from week 2 to week 3 was significant (P = 0.03). The time of day was not associated with pain intensity (P = 0.22). The effect of week was also not significant for pain unpleasantness (P = 0.05), but unpleasantness increased significantly from week 1 to week 2 (P = 0.02). Unpleasantness did not change between week 2 and week 3 (P = 0.20). Participants reported pain interference to be significantly increased from week 1 to week 2 (P = 0.0051); however, the magnitude of decrease in pain interference from week 2 to week 3 was not statistically significant (P = 0.17).

3.2.4. Ability of the Pain Squad app to detect changes after surgery using effect sizes

The descriptive statistics for the effect sizes are outlined in Table 5. All of the median effect sizes for the pain indices from week 1 to 2 were greater than 0.85. The median effect sizes for the pain indices from week 1 to 3 were between 0.13 and 0.32. The highest effect sizes were found for pain interference ratings at both the first and second week postsurgery. A number of the effect size calculations, particularly between week 1 and week 3, were not significant (ie, their 95% confidence intervals included 0), but this may reflect the small study sample size. These findings provide preliminary evidence that the multidimensional electronic pain app is able to detect changes in pain in children and adolescents with cancer undergoing surgery.

Table 5:
Sensitivity of the Pain Squad app in detecting changes in pain after solid tumor resection surgery.

3.3. Feasibility of the Pain Squad app

3.3.1. Adherence

Overall adherence with the 2- and 3-week study protocols was 72.2 ± 23.1% and 47.4 ± 25.2%, respectively. On average, adherence was not significantly different during the week compared with the weekend. Adherence was slightly higher in week 1 (77.9 ± 20.9%) compared with week 2 (66.5 ± 29.6%) in study 1 and higher in week 1 (53.6 ± 27.7%) compared with weeks 2 (45.9 ± 38.2%) and 3 (42.9 ± 34.7%) in study 2. Adherence was not affected by gender, age, or diagnosis. Random-effects modeling using study 1 data showed that worst pain in the past 12 hours and average pain in the past 12 hours were significantly associated with increased missing data (P < 0.05 for both).

3.3.2. Acceptability (likes and dislikes)

Seventy-four participants (80.4%) in study 1 and 12 participants (85.7%) in study 2 completed the Pain Squad Evaluation Questionnaire. The majority of participants in study 1 (n = 52; 70.2%) found it “easy” or “very easy” to remember to complete the app twice daily. In study 2, 4 participants (33.3%) found it easy to remember to complete the app. Most participants liked the way the app looked (study 1: n = 67, 90.5%; study 2: n = 8, 66.7%) and rated it quick to complete (study 1: n = 69, 93.2%; study 2: n = 11, 91.7%). The majority of participants found that completing the app interfered minimally with activities and friends (study 1: n = 70, 94.6%; study 2: n = 11, 91.7%). Most participants were willing to use the app again for at least 2 weeks (study 1: n = 47, 63.5%; study 2: n = 10, 83.3%), whereas the remaining participants were willing to use it for a longer period. Review of the qualitative responses to the free-text questions in the Pain Squad Evaluation Questionnaire indicated general endorsement of the app across study 1 and study 2. Specific participant comments included “Thank you for doing this!” and “It was fun to use.” With regard to recommendations for app improvements, participants commented that the 30-minute window provided to complete an assessment was insufficient and suggested that a larger window of time would be preferable.

4. Discussion

This is the first study to evaluate the validity and reliability of a smartphone-based real-time data capture app to record pain intensity, unpleasantness, and pain interference with function in children and adolescents undergoing cancer therapy. The Pain Squad app has evidence of construct validity and seems able to detect changes in pain related to surgery. Completion of electronic pain ratings twice daily through the app across hospital and home settings is feasible and highly acceptable in the pediatric oncology population.

Examination of the construct validity of the Pain Squad app revealed that it has good convergent validity, demonstrated by pain index VAS ratings being moderately to highly correlated with RPI pain indices on all pairwise comparisons. Similarly, moderate to high correlations between real-time data capture and recalled pain reports have been demonstrated in adolescents with juvenile idiopathic arthritis (0.49-0.84)53 and adult patients with cancer (0.65-0.84).25 The app also demonstrated good discriminant validity (low correlation with overall generic and disease-specific HRQL and pain coping), which has also been previously demonstrated in youth with juvenile idiopathic arthritis.53 Taken together, these findings support the construct validity of the Pain Squad app. Results of the reliability analysis demonstrated that the app has excellent internal consistency, shown in each of the 2 weeks of study 1. Deletion of variables from the scale did not change the app's internal consistency, suggesting that no given variable is more or less important in determining the scale's reliability and providing a rationale for retaining all Pain Squad assessment items.

In addition, the Pain Squad app demonstrated evidence of sensitivity to change over 3 weeks as pain indices' scores increased on average by greater than 3-fold from baseline to 1 week after tumor resection surgery and then decreased to near baseline values by the second week of follow-up. The median effect sizes for each pain index from baseline to the first week postsurgery were large, whereas the effect sizes from baseline to the second week postsurgery were small. Despite the small sample size for study 2, these results provide early evidence of the responsiveness of the Pain Squad app to change over time.

To date, limited empirical data have been generated in exploring the constructs of pain intensity, unpleasantness, and interference in children and adolescents with cancer over time. Furthermore, no research has explored the relationships between these pain indices. Participants in this study reported average pain that was mild in nature. However, when only considering reports where pain was rated as >0/10, current pain indices' ratings were moderate on average and worst pain in the 12-hour period preceding an app report was severe. These findings suggest considerable variation with respect to pain over time in the pediatric cancer population, which was also reflected as significantly higher pain interference and worst pain ratings in evening app reports as compared with morning app reports. The few longitudinal studies that have been conducted to date have indicated that cancer-related pain does vary over time in children and adolescents.15,16,60–62,64 The protocols for these studies, however, often included infrequent sampling, whereby pain reports were collected weeks to months apart. These sampling strategies preclude examination of the pain experience of pediatric cancer patients within days. The ability of electronic diaries, such as the Pain Squad app, to record pain at a frequency dictated by a researcher, clinician, or patient is therefore advantageous in exploring the complex within-day patterns of pediatric cancer pain and provides evidence of the incremental validity of real-time approaches over recalled methods.

Additionally, we found that within-participant increases in pain intensity were significantly related to increases in pain unpleasantness and interference with function. Although the concepts of pain unpleasantness and interference have not been well explored in the pediatric cancer population, unrelieved pain is known to coincide with impaired sleep and daytime fatigue.4 Pain has been shown to result in feelings of distress and fear.21,67 These findings highlight the wide-reaching impact of pain on the lives of pediatric cancer patients, as well as the need for multidimensional pain assessment and rapid management to minimize pain's impact on HRQL.

Smartphone apps aimed at assessing and managing a range of health conditions in diverse patient populations have now been developed; targeted conditions include alcoholism, diabetes, pain, and posttraumatic stress disorder.8,10,19,28,29 Because the field of app-based health assessment and management is still burgeoning, rigorous examinations of the effect of smartphone app interventions on patient, provider, and system outcomes remain few. Programs that have used older electronic methods, such as Internet-based computer programs, to deliver health care interventions have undergone more extensive examination. The application of these programs, including in the area of pain, has resulted in positive effects across a range of health outcomes including HRQL, symptom exacerbation, and disease knowledge.11,33,36,41,47,52 Furthermore, potential positive feasibility outcomes from these electronic programs include high rates of adherence, low rates of study attrition, and high ratings of acceptability.2,33,41,52 In this study, we have shown the feasibility of the Pain Squad app. This feasibility, coupled with the previously demonstrated effectiveness and acceptability of electronic interventions across a breadth of illnesses, suggests that, with modifications to the assessment questions, the Pain Squad app could be used to assess pain and other symptoms in a wide variety of health conditions.

The major strength of this study lies in its use of an electronic real-time data capture approach to assess the pediatric cancer pain experience. However, several limitations should be considered. First, the results of any real-time assessment depend on the sampling strategy used. The Pain Squad app collected pain data in the morning and evening, so we may have missed important fluctuations in pain with this signal-contingent (ie, alarm-driven) approach. More frequent sampling or event-contingent reporting (eg, reporting whenever pain was present) may have reduced this potential bias.35,54 These sampling strategies are being piloted in other work in our laboratory.26 Second, we did not recruit the desired number of participants for study 2, for two main reasons. Many participants were scheduled for surgery less than a week after our first contact with them; because the study was designed to establish a strong baseline of pain reports before the known-painful event, these patients were ineligible. In addition, a number of patients did not have a confirmed cancer diagnosis before surgery and were thus ineligible. Nonetheless, study 2 results indicate that the app is acceptable to children with cancer and shows evidence of sensitivity to change. Future research should examine app sensitivity in larger samples and in samples with different sources of pain (eg, treatment-induced mucositis). Finally, study 2 participants missed a large proportion (52.6%) of pain assessments over the 3-week period. The peri-surgical period is often anxiety-provoking, disabling, and painful for children.39 Given the importance of pain assessment to effective pain management, qualitative interviews with young cancer patients should be undertaken to better understand how pain reporting could be made less burdensome during this period.
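The distinction between the two sampling strategies discussed above can be made concrete with a brief sketch. The class names, schedule, and prompt window below are hypothetical illustrations, not the Pain Squad app's actual implementation: a signal-contingent design prompts at fixed clock times (eg, morning and evening alarms), whereas an event-contingent design records whenever the patient initiates a report.

```python
from datetime import time, datetime

class SignalContingentSchedule:
    """Prompt at fixed clock times (alarm-driven), as in a twice-daily diary."""
    def __init__(self, prompt_times):
        self.prompt_times = prompt_times

    def should_prompt(self, now: datetime, window_minutes: int = 30) -> bool:
        # Fire only when the current time falls within a window after an alarm.
        minutes_now = now.hour * 60 + now.minute
        for t in self.prompt_times:
            start = t.hour * 60 + t.minute
            if start <= minutes_now < start + window_minutes:
                return True
        return False

class EventContingentDiary:
    """Record an entry whenever the patient reports a pain episode."""
    def __init__(self):
        self.entries = []

    def report_pain(self, when: datetime, intensity: int):
        # Patient-initiated entries can capture fluctuations an alarm misses.
        self.entries.append({"time": when, "intensity": intensity})

schedule = SignalContingentSchedule([time(8, 0), time(20, 0)])
print(schedule.should_prompt(datetime(2015, 6, 1, 8, 10)))  # True: morning window
print(schedule.should_prompt(datetime(2015, 6, 1, 13, 0)))  # False: a midday pain spike is missed
```

The second call illustrates the bias noted in the limitations: with alarm-driven sampling alone, pain occurring between prompts goes unrecorded, which event-contingent reporting is designed to capture.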

Pain is a significant problem for children and adolescents with cancer, shows considerable within-person variability, and negatively impacts HRQL. Current methods for evaluating pain in children with cancer suffer from methodological problems (eg, recall bias). Our results indicate that the Pain Squad app is a valid, reliable, and feasible means to prospectively collect patient-reported data on pain in children and adolescents. The establishment of psychometrically sound and feasible approaches to real-time pain assessment is crucial to our understanding of the day-to-day symptom experiences of pediatric cancer patients, which in turn is needed to optimize the cancer care delivered to these children and adolescents and to ultimately decrease suffering and improve their HRQL. Furthermore, the model of real-time pain assessment described here can be used in both clinical and research contexts to evaluate the effectiveness of a wide range of pain treatments, including pharmacological, physical, and psychological modalities.

Conflict of interest statement

The authors have no conflicts of interest to declare.


This research project was conducted with funding support from C17, the Kids With Cancer Society, the Childhood Cancer Canada Foundation, and the Coast to Coast Against Cancer Foundation.

Supplemental Digital Content

Supplemental Digital Content associated with this article can be found online.


[1]. Ameringer S. Barriers to pain management among adolescents with cancer. Pain Manage Nurs 2010;11:224–33.
[2]. Andrews G, Cuijpers P, Craske MG, McEvoy P, Titov N. Computer therapy for the anxiety and depressive disorders is effective, acceptable and practical health care: a meta-analysis. PLoS One 2010;5:e13196.
[3]. Anghelescu DL, Faughnan LG, Jeha S, Relling MV, Hinds PS, Sandlund JT, Cheng C, Pei D, Hankins G, Pauley JL, Pui CH. Neuropathic pain during treatment for childhood acute lymphoblastic leukemia. Pediatr Blood Cancer 2011;57:1147–53.
[4]. Baggott C, Dodd M, Kennedy C, Marina N, Matthay KK, Cooper BA, Miaskowski C. Changes in children's reports of symptom occurrence and severity during a course of myelosuppressive chemotherapy. J Pediatr Oncol Nurs 2010;27:307–15.
[5]. Bhat SR, Goodwin TL, Burwinkle TM, Lansdale MF, Dahl GV, Huhn SL, Gibbs IC, Donaldson SS, Rosenblum RK, Varni JW, Fisher PG. Profile of daily life in children with brain tumors: an assessment of health-related quality of life. J Clin Oncol 2005;23:5493–500.
[6]. Bland JM, Altman DG. Statistics notes: Cronbach's alpha. BMJ 1997;314:572.
[7]. Broderick JE, Schwartz JE, Schneider S, Stone AA. Can end-of-day reports replace momentary assessment of pain and fatigue? J Pain 2009;10:274–81.
[8]. Cafazzo JA, Casselman M, Hamming N, Katzman DK, Palmert MR. Design of an mHealth app for the self-management of adolescent type 1 diabetes: a pilot study. J Med Internet Res 2012;14:e70.
[9]. Cohen J. Statistical power analysis for the behavioral sciences. 2nd ed. Hillsdale, NJ: Lawrence Erlbaum Associates, 1988.
[10]. Cohn AM, Hunter-Reel D, Hagman BT, Mitchell J. Promoting behavior change from alcohol use through mobile technology: the future of ecological momentary assessment. Alcohol Clin Exp Res 2011;35:2209–15.
[11]. Cuijpers P, van Straten A, Andersson G. Internet-administered cognitive behavior therapy for health problems: a systematic review. J Behav Med 2008;31:169–77.
[12]. Czarnecki ML, Simon K, Thompson JJ, Armus CL, Hanson TC, Berg KA, Petrie JL, Xiang Q, Malin S. Barriers to pediatric pain management: a nursing perspective. Pain Manage Nurs 2011;12:154–62.
[13]. Dupuis LL, Ethier MC, Tomlinson D, Hesser T, Sung L. A systematic review of symptom assessment scales in children with cancer. BMC Cancer 2012;12:1.
[14]. Dupuis LL, Milne-Wren C, Cassidy M, Barrera M, Portwine C, Johnston DL, Silva MP, Sibbald C, Leaker M, Routh S, Sung L. Symptom assessment in children receiving cancer therapy: the parents' perspective. Support Care Cancer 2010;18:281–99.
[15]. Fortier MA, Wahi A, Bruce C, Maurer EL, Stevenson R. Pain management at home in children with cancer: a daily diary study. Pediatr Blood Cancer 2014;61:1029–33.
[16]. Gedaly-Duff V, Lee KA, Nail L, Nicholson HS, Johnson KP. Pain, sleep disturbance, and fatigue in children with leukemia and their parents: a pilot study. Oncol Nurs Forum 2006;33:641–6.
[17]. Gendreau M, Hufford MR, Stone AA. Measuring clinical pain in chronic widespread pain: selected methodological issues. Best Pract Res Clin Rheumatol 2003;17:575–92.
[18]. Greco MT, Roberto A, Corli O, Deandrea S, Bandieri E, Cavuto S, Apolone G. Quality of cancer pain management: an update of a systematic review of under treatment of patients with cancer. J Clin Oncol 2014;32:4149–54.
[19]. Hebden L, Cook A, van der Ploeg HP, Allman-Farinelli M. Development of smartphone applications for nutrition and physical activity behavior change. JMIR Res Protoc 2012;1:e9.
[20]. Hedén L, Pöder U, von Essen L, Ljungman G. Parents' perceptions of their child's symptom burden during and after cancer treatment. J Pain Symptom Manage 2013;46:366–75.
[21]. Hedén L, von Essen L, Ljungman G. The relationship between fear and pain levels during needle procedures in children from the parents' perspective. Eur J Pain 2015.
[22]. Hinds PS, Nuss SL, Ruccione KS, Withycombe JS, Jacobs S, DeLuca H, Faulkner C, Liu Y, Cheng YI, Gross HE, Wang J, DeWalt DA. PROMIS pediatric measures in pediatric oncology: valid and clinically feasible indicators of patient-reported outcomes. Pediatr Blood Cancer 2013;60:402–8.
[23]. Huguet A, Miró J, Nieto R. The factor structure and factorial invariance of the Pain-Coping Questionnaire across age: evidence from community-based samples of children and adults. Eur J Pain 2009;13:879–89.
[24]. Jacob E, Pavlish C, Duran J, Stinson J, Lewis MA, Zeltzer L. Facilitating pediatric patient-provider communications using wireless technology in children and adolescents with sickle cell disease. J Pediatr Health Care 2013;27:284–92.
[25]. Jensen MP, Wang W, Potts SL, Gould EM. Reliability and validity of individual and composite recall pain measures in patients with cancer. Pain Med 2012;13:1284–91.
[26]. Jibb LA, Stevens BJ, Nathan PC, Seto E, Cafazzo JA, Stinson JN. A smartphone-based pain management app for adolescents with cancer: establishing system requirements and a pain care algorithm based on literature review, interviews, and consensus. JMIR Res Protoc 2014;3:e15.
[27]. King D, Greaves F, Exeter C, Darzi A. “Gamification”: influencing health behaviours with games. J R Soc Med 2013;106:76–8.
[28]. Kuhn E, Greene C, Hoffman J, Nguyen T, Wald L, Schmidt J, Ramsey KM, Ruzek J. Preliminary evaluation of PTSD coach, a smartphone app for post-traumatic stress symptoms. Mil Med 2014;179:12–18.
[29]. Lalloo C, Jibb LA, Rivera J, Agarwal A, Stinson JN. “There's a pain app for that”: review of patient-targeted smartphone applications for pain management. Clin J Pain 2015;31:557–63.
[30]. Lane SJ, Heddle NM, Arnold E, Walker I. A review of randomized controlled trials comparing the effectiveness of hand held computers with paper methods for data collection. BMC Med Inform Decis Mak 2006;6.
[31]. Liang MH, Lew RA, Stucki G, Fortin PR, Daltroy L. Measuring clinically important changes with patient-oriented questionnaires. Med Care 2002;40:II45–51.
[32]. Lin CC, Lai YL, Ward SE. Effect of cancer pain on performance status, mood states, and level of hope among Taiwanese cancer patients. J Pain Symptom Manage 2003;25:29–37.
[33]. Lorig KR, Ritter PL, Laurent DD, Plant K. The Internet-based arthritis self-management program: a one-year randomized trial for patients with arthritis or fibromyalgia. Arthritis Rheum 2008;59:1009–17.
[34]. Melzack R. The McGill pain questionnaire: from description to measurement. Anesthesiology 2005;103:199–202.
[35]. Morren M, van Dulmen S, Ouwerkerk J, Bensing J. Compliance with momentary pain measurement using electronic diaries: a systematic review. Eur J Pain 2009;13:354–65.
[36]. Murray E, Burns J, See TS, Lai R, Nazareth I. Interactive health communication applications for people with chronic disease. Cochrane Database Syst Rev 2005;4:CD004274.
[37]. O'Sullivan C, Dupuis LL, Gibson P, Johnston DL, Baggott C, Portwine C, Spiegler B, Kuczynski S, Tomlinson D, de Mol Van Otterloo S, Tomlinson GA, Sung L. Refinement of the symptom screening in pediatrics tool (SSPedi). Br J Cancer 2014;111:1262–8.
[38]. O'Sullivan C, Dupuis LL, Sung L. A review of symptom screening tools in pediatric cancer patients. Curr Opin Oncol 2015;27:285–90.
[39]. Pagé MG, Campbell F, Isaac L, Stinson J, Martin-Pichora AL, Katz J. Reliability and validity of the Child Pain Anxiety Symptoms Scale (CPASS) in a clinical sample of children and adolescents with acute postsurgical pain. PAIN 2011;152:1958–65.
[40]. Palermo TM, Valenzuela D, Stork PP. A randomized trial of electronic versus paper pain diaries in children: impact on compliance, accuracy, and acceptability. PAIN 2004;107:213–19.
[41]. Palermo TM, Wilson AC, Peters M, Lewandowski A, Somhegyi H. Randomized controlled trial of an Internet-delivered family cognitive-behavioral therapy intervention for children and adolescents with chronic pain. PAIN 2009;146:205–13.
[42]. Reid GJ, Gilbert CA, McGrath PJ. The pain coping questionnaire: preliminary validation. PAIN 1998;76:83–96.
[43]. Ruccione K, Lu Y, Meeske K. Adolescents' psychosocial health-related quality of life within 6 months after cancer treatment completion. Cancer Nurs 2013;36:E61–72.
[44]. Ruland CM, Hamilton G, Schjødt-Osmo B. The complexity of symptoms and problems experienced in children with cancer: a review of the literature. J Pain Symptom Manage 2009;37:403–18.
[45]. Russell L, Gough K, Drosdowsky A, Schofield P, Aranda S, Butow PN, Westwood JA, Krishnasamy M, Young JM, Phipps-Nelson J, King D, Jefford M. Psychological distress, quality of life, symptoms and unmet needs of colorectal cancer survivors near the end of treatment. J Cancer Surviv 2015;9:1–9.
[46]. Schneider S, Stone AA. Distinguishing between frequency and intensity of health-related symptoms from diary assessments. J Psychosom Res 2014:205–12.
[47]. Spek V, Cuijpers P, Nyklícek I, Riper H, Keyzer J, Pop V. Internet-based cognitive behaviour therapy for symptoms of depression and anxiety: a meta-analysis. Psychol Med 2007;37:319–28.
[48]. Stinson JN. Improving the assessment of pediatric chronic pain: harnessing the potential of electronic diaries. Pain Res Manag 2009;14:59–64.
[49]. Stinson JN, Jibb LA, Lalloo C, Feldman BM, McGrath PJ, Petroz GC, Streiner D, Dupuis A, Gill N, Stevens BJ. Comparison of average weekly pain using recalled paper and momentary assessment electronic diary reports in children with arthritis. Clin J Pain 2014;30:1044–50.
[50]. Stinson JN, Jibb LA, Nguyen C, Nathan PC, Maloney AM, Dupuis LL, Gerstle JT, Alman B, Hopyan S, Strahlendorf C, Portwine C, Johnston DL, Orr M. Development and testing of a multidimensional iPhone pain assessment application for adolescents with cancer. J Med Internet Res 2013;15:e51.
[51]. Stinson JN, Kavanagh T, Yamada J, Gill N, Stevens B. Systematic review of the psychometric properties, interpretability and feasibility of self-report pain intensity measures for use in clinical trials in children and adolescents. PAIN 2006;125:143–57.
[52]. Stinson JN, McGrath PJ, Hodnett E, Feldman BM, Duffy CM, Huber A, Tucker L, Hetherington CR, Tse SM, Spiegel LR, Campillo S, Gill N, White ME. An internet-based self-management program with telephone support for adolescents with arthritis: a pilot randomized controlled trial. J Rheumatol 2010;37:1944–52.
[53]. Stinson JN, Stevens BJ, Feldman BM, Streiner D, McGrath PJ, Dupuis A, Gill N, Petroz GC. Construct validity of a multidimensional electronic pain diary for adolescents with arthritis. PAIN 2008;136:281–92.
[54]. Stone AA, Broderick JE, Schneider S, Schwartz JE. Expanding options for developing outcome measures from momentary assessment data. Psychosom Med 2012;74:387–97.
[55]. Stone AA, Broderick JE, Schwartz JE, Shiffman S, Litcher-Kelly L, Calvanese P. Intensive momentary reporting of pain with an electronic diary: reactivity, compliance, and patient satisfaction. PAIN 2003;104:343–51.
[56]. Stone AA, Shiffman S, Schwartz JE, Broderick JE, Hufford MR. Patient non-compliance with paper diaries. BMJ 2002;324:1193–4.
[57]. Sung L, Klaassen RJ, Dix D, Pritchard S, Yanofsky R, Dzolganovski B, Almeida R, Klassen A. Identification of paediatric cancer patients with poor quality of life. Br J Cancer 2009;100:82–8.
[58]. Tomlinson D, Dupuis LL, Gibson P, Johnston DL, Portwine C, Baggott C, Zupanec S, Watson J, Spiegler B, Kuczynski S, Macartney G, Sung L. Initial development of the symptom screening in pediatrics tool (SSPedi). Support Care Cancer 2013;21:71–5.
[59]. Torta RGV, Munari J. Symptom cluster: depression and pain. Surg Oncol 2010;19:155–9.
[60]. Van Cleve L, Bossert E, Beecroft P, Adlard K, Alvarez O, Savedra MC. The pain experience of children with leukemia during the first year after diagnosis. Nurs Res 2004;53:1–10.
[61]. Van Cleve L, Munoz CE, Riggs ML, Bava L, Savedra M. Pain experience in children with advanced cancer. J Pediatr Oncol Nurs 2012;29:28–36.
[62]. Varni JW, Burwinkle TM, Katz ER. The PedsQL in pediatric cancer pain: a prospective longitudinal analysis of pain and emotional distress. J Dev Behav Pediatr 2004;25:239–46.
[63]. Varni JW, Burwinkle TM, Katz ER, Meeske K, Dickinson P. The PedsQL in pediatric cancer: reliability and validity of the pediatric quality of life inventory generic core scales, multidimensional fatigue scale, and cancer module. Cancer 2002;94:2090–106.
[64]. Walker AJ, Gedaly-Duff V, Miaskowski C, Nail L. Differences in symptom occurrence, frequency, intensity, and distress in adolescents prior to and one week after the administration of chemotherapy. J Pediatr Oncol Nurs 2010;27:259–65.
[65]. Walther B, Hossin S, Townend J, Abernethy N, Parker D, Jeffries D. Comparison of electronic data capture (EDC) with the standard data capture method for clinical trial data. PLoS One 2011;6:e25348.
[66]. Wilson M. Integrating the concept of pain interference into pain management. Pain Manage Nurs 2014;15:499–505.
[67]. Windich-Biermeier A, Sjoberg I, Dale JC, Eshelman D, Guzzetta CE. Effects of distraction on pain, fear, and distress during venous port access and venipuncture in children and adolescents with cancer. J Pediatr Oncol Nurs 2007;24:8–19.

Pediatrics; Cancer pain; Measurement; Psychometric testing; Real-time data capture; Smartphone app

© 2015 International Association for the Study of Pain