Preliminary Efficacy of Online Traumatic Brain Injury Professional Development for Educators: An Exploratory Randomized Clinical Trial

Glang, Ann E. PhD; McCart, Melissa EdD; Slocumb, Jody BA; Gau, Jeff M. MS; Davies, Susan C. EdD; Gomez, Doug MS; Beck, Laura MLIS

Journal of Head Trauma Rehabilitation. 2019;34(2):77-86. DOI: 10.1097/HTR.0000000000000447


Each year, approximately 700 000 US children aged 0 to 19 years sustain a traumatic brain injury (TBI) requiring hospitalization or emergency treatment.1 These children are at risk for a range of disabilities that impair academic performance and their transition to postsecondary education and employment.2–9 Children with moderate and severe injuries are likely to have cognitive, behavioral, and social difficulties that affect their long-term quality of life.10 Even mild injuries to a developing brain can result in persistent neural alterations11–13 that significantly affect social and educational functioning.14 Increasingly, it is recognized that most children with TBI receive rehabilitation services in the school setting.15,16 A range of evidence-based assessment and instructional approaches can help mitigate the academic and behavioral challenges associated with TBI and the long-term problems that can follow these children into adulthood.17

Unfortunately, few educator preparation programs include training on TBI in their coursework. This gap in training spans all of the disciplines serving students with TBI. For example, school psychologists are responsible for assessing, identifying, and planning educational interventions to address learning and behavior problems and are uniquely positioned to support students with TBI, often providing guidance to educators about the most appropriate services for meeting a student's educational needs.18–20 However, a national survey of 42 school-psychology program directors, representing 22% of the programs approved by the National Association of School Psychologists, found that none of the school-psychology graduate programs devoted a specific course to TBI, and most programs provided only 61 to 90 minutes of total instruction on the topic.18 A similar survey of 156 faculty members at 100 randomly selected teacher training programs found that TBI-specific training was minimal in both undergraduate general education programs and special education programs.18

Multiple surveys indicate that school professionals lack the knowledge and skills needed to adequately support students with TBI.21–25 Findings from a national survey of special and general education teachers suggest that teachers continue to have misconceptions and significant knowledge gaps about TBI.23 In that survey of more than 350 general and special education teachers, the average knowledge score was 55.9%, with special education teachers, those with TBI training, and those with more experience scoring higher.23 Surveys of speech/language pathologists26 reveal that they also have limited knowledge about assessments and interventions for students affected by TBI.19 Together, these survey results suggest inadequate preparation across the professions charged with supporting students in the school setting. This lack of knowledge has led to the misidentification and underidentification of students with TBI, potentially leaving this group of students with disabilities significantly underserved.27

Several recent articles outline guidelines for effectively delivering services to students with TBI.28–30 Each of these sets of guidelines includes a recommendation for professional development for school personnel. Given the national attention to “Return to Learn” after TBI and ongoing interest in effective supports for students with TBI in the school setting,29,30 there is a critical need for research on building the capacity of educators to serve students with TBI and improve their outcomes.

This article reports the findings of a randomized controlled trial of an online TBI professional development intervention, In the Classroom After Concussion: Best Practices for Student Success. The program includes brain injury training and resources for educators. The study's hypotheses were as follows:

  • (1) Use of the In the Classroom training by educators will result in increased knowledge of the effects of brain injury compared with controls.
  • (2) Use of the In the Classroom training will result in increased educator knowledge of effective educational strategies for supporting students with TBI compared with controls.
  • (3) Use of the In the Classroom training will result in increased self-efficacy in supporting students with TBI in the school setting.

METHODS

Participants

Participants were recruited through contacts in school districts in Oregon and Ohio. Criteria for participation were being at least 18 years of age, speaking English, and working as a general education classroom teacher in a school setting. The study was reviewed by the University of Oregon's Institutional Review Board.

Sample size was determined a priori using conservative assumptions based on our pilot test of the In the Classroom training (N = 30) to detect medium condition effects of d = 0.49 or greater. One hundred educators participated in the study (see Figure 1). A summary of demographic characteristics is provided in Table 1.
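
As an illustration of this kind of a priori calculation, the sketch below uses the statsmodels power routines to estimate power for a two-group comparison at d = 0.49 with 50 educators per condition. The alpha level and two-sided test are assumptions made for the example; the article does not report the power parameters used, so these numbers will not necessarily match the original calculation.

```python
# A minimal sketch of an a priori power calculation for a two-group design,
# using statsmodels. The target effect size (d = 0.49) and group sizes (50 per
# arm) come from the text; the alpha level and sidedness are assumptions.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Power achieved for a two-sample comparison with 50 educators per condition,
# assuming a two-sided alpha of .05.
power = analysis.solve_power(effect_size=0.49, nobs1=50, ratio=1.0,
                             alpha=0.05, alternative='two-sided')
print(f"Estimated power to detect d = 0.49 with n = 50 per group: {power:.2f}")

# Sample size per group needed to reach 80% power under the same assumptions.
n_per_group = analysis.solve_power(effect_size=0.49, power=0.80, ratio=1.0,
                                   alpha=0.05, alternative='two-sided')
print(f"n per group for 80% power: {n_per_group:.0f}")
```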

Figure 1. CONSORT flowchart.
TABLE 1 - Demographic characteristics by study condition

                                         Control (N = 50)      Treatment (N = 50)
                                         N        %            N        %
Female                                   35       70.0         40       80.0
Hispanic                                 0        0.0          0        0.0
Race
  American Indian                        0        0.0          3        6.0
  Native Hawaiian or Pacific Islander    1        2.0          0        0.0
  Black or African American              1        2.0          1        2.0
  White or Caucasian                     45       90.0         45       90.0
  Mixed race                             2        4.0          1        2.0
  Unknown                                1        2.0          0        0.0
Highest level of education completed
  Some college                           0        0.0          1        2.0
  College degree (BA, BS)                14       28.0         13       26.5
  Graduate school (MA, MS, PhD)          36       72.0         35       71.4
Annual household income
  ≤$19 999                               2        4.0          2        4.0
  $20 000-$39 999                        1        2.0          1        2.0
  $40 000-$59 999                        12       24.0         10       20.0
  $60 000-$79 999                        14       28.0         15       30.0
  $80 000-$99 999                        8        16.0         8        16.0
  >$100 000                              13       26.0         14       28.0

Measures

TBI knowledge

The TBI knowledge items measured knowledge of effective strategies for working with students with TBI in school settings. Each knowledge item used a multiple-choice format with 4 response options, with items derived from the In the Classroom training program. The knowledge score represented the proportion of correctly answered items. Items were pilot tested with a sample of 18 educators, and 2-week test-retest data were collected with a separate sample of 44 educators. The test-retest reliability for the TBI knowledge survey was r = 0.64.
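
A minimal sketch of this scoring and reliability approach is shown below, under the assumption that each item is scored correct/incorrect and that test-retest reliability is the Pearson correlation between the two administrations; the answer key and scores are hypothetical.

```python
# Sketch of proportion-correct scoring and 2-week test-retest reliability.
# The answer key, responses, and test-retest scores below are hypothetical.
import numpy as np
from scipy.stats import pearsonr

def knowledge_score(responses, answer_key):
    """Return the percentage of multiple-choice items answered correctly."""
    responses = np.asarray(responses)
    answer_key = np.asarray(answer_key)
    return 100.0 * np.mean(responses == answer_key)

# Example: one educator's responses against a hypothetical 10-item answer key.
key       = ["B", "A", "D", "C", "B", "A", "C", "D", "B", "A"]
responses = ["B", "A", "C", "C", "B", "A", "C", "D", "A", "A"]
print(f"Knowledge score: {knowledge_score(responses, key):.0f}%")  # 80%

# Hypothetical scores from two administrations 2 weeks apart (one per educator).
time1 = np.array([60.0, 75.0, 55.0, 80.0, 70.0, 65.0])
time2 = np.array([65.0, 70.0, 60.0, 85.0, 75.0, 60.0])
r, p = pearsonr(time1, time2)  # test-retest reliability
print(f"2-week test-retest r = {r:.2f}")
```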

TBI knowledge application

Knowledge application was assessed using 5 text scenarios and 2 video scenarios depicting a teacher responding to a classroom situation with a student with TBI. For each text-based knowledge application item, participants were asked to determine how effective the teacher's response was, using a multiple-choice format with 4 response options. The video-based knowledge application items used an open-response format asking what suggestions respondents would offer to help the teacher in the video scenario. Respondents received 1 point for each evidence-based behavioral or instructional strategy mentioned in their response (1 video-based item had a possible total of 3, the other 5). To create a code list and agree on coding procedures, 4 of the authors coded the first 20 participants' responses. The remaining responses were coded by 1 individual, and a different researcher then recoded 20% of the participants' data to determine percentage agreement between the final 2 coders (93.5% match). Coded data were then analyzed by counting the effective instructional or behavioral strategies mentioned in each response and calculating a percent total. The 2-week test-retest reliability for items related to knowledge application was r = 0.57.
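
The sketch below illustrates the double-coding agreement check and one plausible reading of the "percent total" scoring (the strategy count expressed as a percentage of the possible total for that item); all counts are hypothetical, and this is not the project's actual coding procedure or software.

```python
# Sketch of a simple interrater percentage-agreement check and conversion of
# strategy counts to a percentage of the possible total. Values are hypothetical.
import numpy as np

def percent_agreement(coder_a, coder_b):
    """Percentage of responses on which the two coders assigned the same count."""
    coder_a, coder_b = np.asarray(coder_a), np.asarray(coder_b)
    return 100.0 * np.mean(coder_a == coder_b)

def percent_of_possible(strategy_count, possible_total):
    """Express the number of strategies mentioned as a percentage of the maximum."""
    return 100.0 * strategy_count / possible_total

# Hypothetical recoded subset (counts of strategies identified per response).
primary = [1, 2, 0, 3, 1, 2, 2, 1]
second  = [1, 2, 0, 3, 1, 1, 2, 1]
print(f"Coder agreement: {percent_agreement(primary, second):.1f}%")

# One video item allowed up to 3 strategies, the other up to 5 (from the text).
print(f"{percent_of_possible(2, 3):.1f}% of possible strategies identified")
```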

Self-efficacy

Because self-efficacy is theoretically linked to behavior change,31,32 survey items assessed this construct using the text and video scenarios from the TBI knowledge application domain, asking how confident respondents felt in handling the described behavioral and instructional situations (ie, applied self-efficacy). A 6-point Likert scale was used with the text and video scenarios, such that 1 = not at all confident and 6 = completely confident. The test-retest reliability for items related to applied self-efficacy was r = 0.86.

In addition, participants responded to 8 items related to general self-efficacy. Those items asked how effectively respondents felt that they could implement various instructional strategies for students with TBI. This general self-efficacy scale, adapted from a standardized instrument,33 used a 5-point Likert scale (1 = not at all and 5 = extremely effective). The test-retest reliability in the pilot test for items related to general self-efficacy was r = 0.79. Internal consistency in the current study was also good (α = .89).
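
The sketch below shows how an internal-consistency estimate such as the reported α = .89 can be computed for a set of Likert items (Cronbach's alpha); the 6 × 8 rating matrix is a hypothetical placeholder.

```python
# Sketch of Cronbach's alpha for an 8-item scale rated 1-5. Ratings are hypothetical.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical ratings: 6 respondents x 8 general self-efficacy items (1-5 scale).
ratings = np.array([
    [4, 4, 5, 4, 3, 4, 4, 5],
    [3, 3, 4, 3, 3, 3, 4, 3],
    [5, 5, 5, 4, 5, 5, 4, 5],
    [2, 3, 3, 2, 3, 2, 3, 3],
    [4, 4, 4, 4, 4, 5, 4, 4],
    [3, 4, 3, 3, 3, 3, 3, 4],
])
print(f"Cronbach's alpha = {cronbach_alpha(ratings):.2f}")
```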

Program satisfaction

At posttest, treatment participants completed a 7-item questionnaire on program usability (ease of use, navigation) and satisfaction with the program (likelihood of recommending the program). Participants rated program usability statements on a 6-point Likert scale (1 = strongly disagree and 6 = strongly agree). We assessed program satisfaction by asking participants whether they would recommend the program to others. Participants assigned to the LEARNet condition did not complete the program satisfaction measure.

Procedures

In the Classroom training program

The In the Classroom training program offers specific strategies for managing TBI-related cognitive, behavioral, and social problems in the school setting. Because there is very little empirical evidence on educational interventions with students with TBI, the program incorporates evidence-based instructional strategies that have been validated with students with other disabilities who have similar educational needs.34,35 The training was developed with input from educators from across the country, youth with TBI, family members, and state department of education partners. The interactive program has an empirically validated design that provides individually tailored video-based training36,37 using evidence-based instructional design principles.38,39 The program consists of 20 brief modules that incorporate validated instructional design components, including (a) application exercises with assessment and remediation loops to ensure comprehension, (b) interactive segments involving real-life scenarios that test user comprehension, and (c) sufficient practice and review to ensure content mastery. For example, in the module on managing behavior, participants view a video of a student becoming upset while doing his math homework; when the teacher intervenes, the student throws his book and threatens the teacher. Participants are asked to identify, from 4 possible responses, the most appropriate way for the teacher to respond.
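
The sketch below illustrates the assessment-and-remediation-loop pattern described above in a simplified form; the question, response options, and remediation text are hypothetical placeholders, not the program's actual content or implementation.

```python
# Illustrative sketch of an assessment-and-remediation loop: a learner who
# answers a module check incorrectly is routed to remediation content before
# continuing. Module content and question text are hypothetical placeholders.

MODULE_CHECK = {
    "question": "A student with TBI becomes upset and throws his book. "
                "What is the most appropriate first response?",
    "options": [
        "A. Send the student to the office immediately",
        "B. Calmly reduce demands and give the student time to de-escalate",
        "C. Insist the student finish the assignment now",
        "D. Ignore the behavior entirely",
    ],
    "correct": "B",
    "remediation": "Review: antecedent strategies and de-escalation for students with TBI.",
}

def run_module_check(module, get_answer):
    """Loop until the learner answers correctly, showing remediation after misses."""
    while True:
        answer = get_answer(module["question"], module["options"])
        if answer.strip().upper() == module["correct"]:
            print("Correct - continuing to the next module segment.")
            return
        print(module["remediation"])  # remediation loop before re-asking

if __name__ == "__main__":
    # Simulated learner responses (first attempt wrong, second correct).
    scripted = iter(["C", "B"])
    run_module_check(MODULE_CHECK, lambda question, options: next(scripted))
```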

Usual care: LEARNet

LEARNet (http://www.projectlearnet.org/) is a Web-based resource for teachers, clinicians, parents, and students developed by the Brain Injury Association of New York State. The site is browsable; users select content of most interest and are not required to view all sections of the site. The site uses a problem-solving approach and includes comprehensive video and text-based resources.

The key differences between the In the Classroom and LEARNet sites involve instructional design. LEARNet includes no application exercises or checks on content mastery. Practice and review are user-initiated, and all video examples are in lecture format rather than classroom scenarios depicting implementation of specific strategies. The In the Classroom site incorporates evidence-based instructional design principles38,39 with a focus on video modeling of validated instructional practices.

Experimental procedures

The evaluation was conducted between March 2017 and July 2017 in school district and university classrooms in Ohio and Oregon. Following determination of eligibility, the project coordinator used an online research randomizer (https://www.randomizer.org/) to randomly assign educators to the treatment or usual care group. In both conditions, participants met as a group but accessed the Web-based materials individually, using headphones and a personal Internet-connected device or one provided by the research team. Research assistants monitored participants to ensure that they continued working through the program during the training session(s). When participants arrived at the study location, they were granted access to either (1) the In the Classroom Web site (treatment group) or (2) the LEARNet Web site (usual care group).
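
A generic version of this 1:1 assignment step is sketched below. The study used the online research randomizer cited above, so the seeded shuffle here is purely illustrative.

```python
# Sketch of simple 1:1 random assignment to the two study conditions.
# The fixed seed and participant IDs are illustrative only.
import random

def assign_conditions(participant_ids, seed=2017):
    """Randomly split participants evenly between treatment and usual care."""
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {
        "In the Classroom (treatment)": ids[:half],
        "LEARNet (usual care)": ids[half:],
    }

if __name__ == "__main__":
    assignments = assign_conditions([f"educator_{i:03d}" for i in range(1, 101)])
    for condition, members in assignments.items():
        print(condition, len(members))
```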

At each study location, the educators completed the pretest, accessed either the In the Classroom or LEARNet site, and completed the posttest in 1 or 2 sittings (5-6 hours total time). Educators in the treatment group completed all 23 In the Classroom modules in sequential order. Participants in the LEARNet condition self-selected the viewed content. Follow-up assessments were sent out 60 days after posttest completion. All participants completed the training within 1 week. All data were collected electronically. Research assistants responsible for study logistics and data collection were not blind to study condition.

Data analysis

Before the main analyses, we screened outcomes for normality and outliers and examined baseline equivalency between conditions. Tests for efficacy of the intervention for continuous outcomes (TBI knowledge, TBI knowledge application, and self-efficacy) were specified with mixed-effects growth models that were fit using SAS PROC MIXED (version 9.2) and estimated with restricted maximum likelihood. Individual variability in outcomes from posttest to follow-up was modeled as a function of condition, adjusting for pretest outcome values, to ensure that any differences in pretest levels of the outcome did not bias estimates of intervention effects. Condition effects thus represent group differences in outcomes at posttest. We included a condition × time interaction (coded in months since posttest) to test whether changes in outcomes were significantly stronger at posttest versus 2-month follow-up or vice versa. Intercepts and slopes were specified as random effects. Effect sizes were derived by dividing the difference in model-implied means for each condition by the baseline pooled standard deviation, an effect size for growth models that is equivalent to Cohen d.40 The video-based knowledge application outcomes were counts of strategies identified, and therefore tests of efficacy were evaluated with Poisson regression models using SPSS (version 19). Separate models were run at posttest and follow-up, adjusting for pretest scores. Missing data were minimal (0% at baseline and posttest and 5% at follow-up), and the maximum likelihood procedures used to estimate growth models for the continuous outcomes allowed for analyses of all participants. One imputed data set was generated to accommodate the 5% of missing video-based knowledge application follow-up data for the Poisson models. An unadjusted 2-tailed critical P value less than .05 was used to evaluate statistical significance for all tests.
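
For readers who work in open-source tools, the sketch below outlines roughly analogous models in Python's statsmodels (the study itself used SAS PROC MIXED and SPSS). The long-format DataFrame layout and the column names (id, outcome, pretest, condition, time, strategy_count, pretest_count) are assumptions made for the example, not the study's actual variable names.

```python
# Rough statsmodels analogues of the analyses described above. Assumes a
# long-format DataFrame with one row per participant per post-baseline
# assessment and hypothetical columns: id, outcome, pretest,
# condition (0 = LEARNet, 1 = ITC), and time (months since posttest).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_growth_model(df: pd.DataFrame):
    # Mixed-effects growth model: the condition coefficient is the group
    # difference at posttest (time = 0), the condition x time interaction is
    # the difference in change from posttest to follow-up, with random
    # intercepts and slopes for time and REML estimation.
    model = smf.mixedlm("outcome ~ pretest + condition * time",
                        data=df, groups=df["id"], re_formula="~time")
    return model.fit(reml=True)

def fit_poisson_model(df: pd.DataFrame):
    # Poisson regression for the count of strategies identified at a single
    # assessment, adjusting for the pretest count.
    model = smf.glm("strategy_count ~ pretest_count + condition",
                    data=df, family=sm.families.Poisson())
    return model.fit()
```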

RESULTS

Preliminary analyses

The continuous outcomes approximated normal distributions. One outlier was detected for self-efficacy at follow-up. Participants in the 2 conditions did not differ significantly in demographic characteristics or pretest measures of the outcomes (all P values > .063), indicating that randomization produced initially equivalent groups. Table 2 provides means and standard deviations for continuous outcomes, and Table 3 lists the number of strategies identified on the video-based knowledge application items.

TABLE 2 - Descriptive statistics for continuous study outcomes

                           Pretest            Posttest           Follow-up
                           Mean     SD        Mean     SD        Mean     SD
TBI knowledge
  Control                  65.34    6.80      68.03    5.72      65.82    7.04
  Intervention             67.74    5.92      77.47    6.08      71.79    8.52
TBI skill application
  Control                  71.13    11.07     72.13    11.10     70.05    12.36
  Intervention             72.75    11.21     77.50    11.01     70.44    11.73
Applied self-efficacy
  Control                  4.72     0.58      5.12     0.55      4.73     0.99
  Intervention             4.71     0.78      5.29     0.59      5.19     0.64
General self-efficacy
  Control                  4.10     0.39      4.26     0.43      4.17     0.33
  Intervention             4.03     0.42      4.36     0.44      4.30     0.47

Abbreviation: TBI, traumatic brain injury.

TABLE 3 - Descriptive statistics for count study outcomesa

                                    Control                              Intervention
Number of strategies     Pretest    Posttest    Follow-up     Pretest    Posttest    Follow-up
First video-based knowledge application item, %
  0                      14.3       12.0        22.4          6.0        2.0         17.4
  1                      44.9       40.0        40.8          38.0       20.4        28.3
  2                      26.5       44.0        28.6          30.0       30.6        23.9
  3                      14.3       12.0        8.0           22.0       28.6        26.1
  4                      0.0        12.0        0.0           4.0        16.3        4.3
  5                      0.0        0.0         0.0           0.0        2.0         0.0
Second video-based knowledge application item, %
  0                      44.0       26.0        26.5          26.0       6.0         17.0
  1                      48.0       48.0        65.3          54.0       36.0        63.8
  2                      8.0        26.0        8.2           20.0       52.0        17.0
  3                      0.0        0.0         0.0           0.0        6.0         2.1

aCells are the percentage of participants who identified the given number of strategies on each video-based knowledge application item.

Efficacy of the intervention

On the posttest assessment, In the Classroom (ITC) educators showed significantly greater gains in TBI knowledge (P < .0001, d = 1.36 [large effect]), TBI knowledge application (P = .0261, d = 0.46 [medium effect]), and general self-efficacy (P = .0106, d = 0.39 [small to medium effect]) than the LEARNet controls (see Table 4). The ITC educators also showed trend-level higher applied self-efficacy posttest scores (P = .0512, d = 0.26 [small effect]) than the LEARNet controls. Over the follow-up period, the condition × time interaction was significant but negative for TBI knowledge (P = .0224, d = −0.54 [medium effect]), indicating that after a significant increase at posttest, the ITC educators showed significantly greater decreases relative to the LEARNet controls over the 2-month follow-up period. We estimated the model-implied least squares means at 2-month follow-up to evaluate whether the significant gains at posttest for ITC educators were maintained through the end of the study period. Differences in least squares means showed that ITC educators, relative to LEARNet controls, maintained significant gains in TBI knowledge (P = .001, d = 0.82 [large effect]) and general self-efficacy (P = .018, d = 0.38 [small to medium effect]) but not in TBI knowledge application (P = .921, d = 0.02 [negligible effect]). Applied self-efficacy, which favored ITC only at a trend level at posttest, was significantly greater for ITC educators than for LEARNet controls at follow-up (P = .006, d = 0.66 [medium effect]). The general self-efficacy growth model was rerun excluding the 1 outlier, and the results were similar to the original analysis.

TABLE 4 - Results from growth models for continuous outcomesa

Variable                  Parameter                        β         SE       t        P
TBI knowledge             Intercept                        68.431    0.799    85.61    <.0001
                          Pretest knowledge                0.339     0.083    4.06     <.0001
                          Condition                        8.636     1.139    7.58     <.0001
                          Time                             −1.175    0.519    −2.26    .0259
                          Condition × Time                 −1.704    0.734    −2.32    .0224
TBI skill application     Intercept                        72.328    1.554    46.54    <.0001
                          Pretest skill application        0.250     0.079    3.15     .0022
                          Condition                        4.969     2.200    2.26     .0261
                          Time                             −1.034    1.020    −1.01    .3131
                          Condition × Time                 −2.601    1.443    −1.80    .0746
Applied self-efficacy     Intercept                        5.294     0.065    81.62    <.0001
                          Pretest applied self-efficacy    0.438     0.063    6.92     <.0001
                          Condition                        0.181     0.091    1.97     .0512
                          Time                             −0.061    0.061    −1.00    .3184
                          Condition × Time                 0.138     0.085    1.61     .1105
General self-efficacy     Intercept                        4.231     0.043    99.02    <.0001
                          Pretest general self-efficacy    0.705     0.061    11.57    <.0001
                          Condition                        0.158     0.061    2.60     .0106
                          Time                             −0.037    0.027    −1.36    .1778
                          Condition × Time                 −0.001    0.038    −0.02    .9814

Abbreviations: SE, standard error; TBI, traumatic brain injury.
aCondition and Condition × Time parameters test the efficacy of the In the Classroom (ITC) intervention. Condition parameters represent group differences in outcomes at posttest for the ITC group relative to the LEARNet group. Condition × Time parameters represent the difference in change in the outcomes from posttest to follow-up for the ITC group relative to the LEARNet group.
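
As a worked check of the effect-size approach described in the Data analysis section, dividing the posttest condition coefficient for TBI knowledge (Table 4) by the baseline pooled standard deviation (Table 2) closely reproduces the reported large effect:

```python
# Worked check of the growth-model effect size for TBI knowledge, using the
# values reported in Tables 2 and 4 of this article.
import math

condition_beta = 8.636                    # Table 4: posttest condition effect
sd_control, sd_intervention = 6.80, 5.92  # Table 2: pretest SDs (n = 50 per group)

# Pooled baseline SD; with equal group sizes this is the root mean of the variances.
pooled_sd = math.sqrt((sd_control**2 + sd_intervention**2) / 2)

d = condition_beta / pooled_sd
# Prints roughly d = 1.35, in line with the reported d = 1.36 (small differences
# reflect rounding and the article's use of model-implied means).
print(f"Pooled baseline SD = {pooled_sd:.2f}, d = {d:.2f}")
```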

We detected no group differences at posttest or follow-up for the number of strategies identified from video-based knowledge application item 1 (see Table 5). However, at posttest, the ITC educators identified 46% more strategies than the LEARNet control educators for video-based knowledge application item 2, a significant difference (odds ratio = 1.46, 95% confidence interval = 1.01-2.10, P = .042). The gains at posttest for ITC educators for the second video item were not maintained after the 2-month follow-up period (odds ratio = 1.19, 95% confidence interval = 0.79-1.82, P = .407).

TABLE 5 - Results from Poisson regression models for count outcomesa

                                    Posttest                        Follow-up
                                    OR      95% CI       P          OR      95% CI       P
First VBS application
  Pretest VBS application           1.25    1.09-1.44    .001       1.35    1.15-1.59    <.001
  Condition                         1.13    0.86-1.49    .376       1.32    0.95-1.84    .100
Second VBS application
  Pretest VBS application           1.30    1.01-1.69    .048       1.20    0.89-1.66    .212
  Condition                         1.46    1.01-2.10    .042       1.19    0.79-1.82    .407

Abbreviations: CI, confidence interval; OR, odds ratio; VBS, video-based situation.
aOdds ratios for the condition effects indicate the relative increase, for the In the Classroom group relative to the LEARNet group, in the number of strategies identified from the video-based situation application. Condition effects are adjusted for group differences in pretest video-based situation application scores.

Program satisfaction

Posttest ratings showed that participants assigned to the In the Classroom (treatment) condition were satisfied with the program (LEARNet participants did not complete the program satisfaction measure). The average rating on the 6-point scale from “strongly disagree” to “strongly agree” for the 3 positively worded items (mean = 5.3, SD = 0.7) indicates that participants “agree” that the program is easy to use, can be learned quickly, and can be used with confidence. The average rating for the 4 negatively worded items (mean = 1.8, SD = 0.9) indicates that participants “disagree” that the program is unnecessarily complex, that they need help using the program, that the program is difficult to use, and that they need to learn a lot before using it. In addition, 100% of participants reported that they would recommend the program to others.

DISCUSSION

Results from this randomized controlled trial show that at the posttest assessment, educators who completed the In the Classroom training demonstrated significantly greater gains in TBI knowledge and the ability to apply that knowledge to real-world classroom scenarios than the LEARNet controls. This included 1 of 2 video-based scenarios, a response format that closely resembles the real-world decision-making skills required of classroom teachers. We also found significant differences in applied self-efficacy for working with students with TBI. Gains seen at posttest were maintained after the 2-month follow-up, except for the knowledge application measure.

These findings are especially promising because the sample consisted entirely of general education classroom teachers. Most students with TBI are primarily served in general education settings, where teachers rarely have knowledge or expertise in brain injury.23,41,42 Indeed, the average TBI knowledge score across all teachers in our sample was approximately 66%. This inadequate level of TBI knowledge among general educators likely contributes to weak service delivery and support for students with TBI. In fact, teachers' limited knowledge is associated with a lack of self-confidence in knowing how to effectively teach a student with a severe TBI.41 Knowledge gaps that affect service delivery and supports for students with TBI are critical for the field to address. Recent research evidence demonstrates how important it is for educators to adequately understand students with TBI.43 Traumatic brain injury–specific training could leave individual teachers better prepared to monitor students' postinjury problems and better equipped with strategies to deal with learning and behavioral challenges as they arise. Well-trained teachers would also be able to better understand parents' perspectives and communicate knowledgeably about how best to address students' challenges.

The finding that knowledge application scores decreased over time was disappointing but not surprising. To increase maintenance, the program might benefit from online booster sessions. It might also be supplemented by hands-on, real-world experiences with students who have sustained TBIs. If a teacher has limited opportunity to apply new ideas from professional development to classroom instruction, improved student learning cannot be expected.44 Thus, the In the Classroom training could provide the foundation for more comprehensive professional development in TBI that is reinforced by practice and feedback in classroom settings.

Although recommendations for educational management of TBI in school settings consistently include training for educators,28,29 no intervention studies to date have examined how best to provide such training. In this study, we have taken an initial step toward answering that question. The findings show that online training is effective in improving knowledge and self-efficacy and that those gains are maintained over time. The next step in this line of research is to conduct additional studies looking at the real-world applications of this training. For example, what happens when this course is integrated within a broader professional development program in a school district? Do educators access the training on their own time, and if so, does the training result in gains in knowledge, knowledge application, and self-efficacy? How might this type of training be combined with evidence-based consultation and support to create a more robust model?45 And, most important, what are the effects of the training on outcomes among students with TBI?

Limitations

Although our results are promising, this study has several limitations. Its scope was limited to examining the effects of In the Classroom on educator knowledge and self-efficacy in implementing effective instructional and behavior management practices with students with TBI. Although changes in knowledge and self-efficacy have been theoretically linked to behavior change in the health behavior literature,31,32 the potential correlation of these measures with educator behavioral change is unclear. The measures also have limitations: the test-retest reliability for both the knowledge and knowledge application measures was low (0.64 and 0.57, respectively), and clinically meaningful change is unknown for these measures.

Unfortunately, our assessment protocol precluded a more in-depth examination of the range of variables that might affect TBI management practices in schools. For example, we did not include an assessment of how educators actually used the learned skills in their classrooms with students with TBI. A large body of research has shown that trainings such as In the Classroom, without hands-on practice and feedback in the instructional context, are unlikely to transfer to classroom practice.45 We have demonstrated that educators can effectively learn new strategies and apply that knowledge to text and video-based scenarios. However, additional practice and feedback/coaching will be required to ensure that those gains transfer to the classroom.

The sample was also a limitation of this study. Because most students with TBI are served in general education classrooms,42 we chose to evaluate the training with only classroom teachers and did not include special education teachers or other educators (eg, school psychologists, speech/language pathologists). Thus, we were unable to compare gains between groups of educators. We also acknowledge a significant lack of ethnic diversity in the study population. Differences in study outcomes by minority status are important, but because our sample included primarily white women, we lack the statistical power to adequately address differential effects as a function of minority and gender status. This limits the generalizability of the findings. In addition, there is a small chance that during the 2-month follow-up period, participants may have sought out training in TBI (perhaps via online sources or independent study), thus compromising the follow-up data. Finally, because the follow-up assessment occurred at 2 months, we have no information about the long-term maintenance of the gains educators demonstrated in TBI knowledge and self-efficacy. Future evaluation efforts could include a longer follow-up assessment.

CONCLUSION

Interactive online instruction is an effective vehicle for delivering educator training and is increasingly being used for professional development.46–48 The US Department of Education's 2009 meta-analysis of online learning reviewed more than 1000 empirical studies and found that, on average, students using online learning environments outperformed their counterparts who received face-to-face instruction.38 Studies examining the efficacy of Web-based training found increases in knowledge, skills, and participant satisfaction and engagement.49,50

The current climate of school reform emphasizes, and often requires, professional development,51 but doing so creates financial and time burdens for already strained educational systems.52 Therefore, schools need affordable tools that promote educator knowledge and practices, build educator capacity, and respect educator time constraints. Online professional development is efficient and can overcome the barriers of time, cost, scheduling, and travel.50,53

Given the prevalence of TBI in today's schools, it is important to develop evidence-based, cost-effective approaches to knowledge transfer and exchange in TBI professional development. In the Classroom is one such approach. Results from this randomized controlled trial demonstrate that when educators received the In the Classroom intervention, their knowledge of, and self-efficacy in implementing, effective strategies for students with TBI improved.

REFERENCES

1. Faul M, Xu L, Wald M, Coronado V. Traumatic Brain Injury in the United States: Emergency Department Visits, Hospitalizations and Deaths 2002–2006. Atlanta, GA: Centers for Disease Control and Prevention, National Center for Injury Prevention and Control; 2010.
2. Anderson V, Catroppa C, Morse S, Haritou F, Rosenfeld JV. Intellectual outcome from preschool traumatic brain injury: a 5-year prospective, longitudinal study. Pediatrics. 2009;124(6):e1064–e1071.
3. Catroppa C, Anderson V. Recovery in memory function, and its relationship to academic success, at 24 months following pediatric TBI. Child Neuropsychol. 2007;13(3):240–261.
4. Beauchamp M, Catroppa C, Godfrey C, Morse S, Rosenfeld JV, Anderson V. Selective changes in executive functioning ten years after severe childhood traumatic brain injury. Dev Neuropsychol. 2011;36(5):578–595.
5. Chapman LA, Wade SL, Walz NC, Taylor HG, Stancin T, Yeates KO. Clinically significant behavior problems during the initial 18 months following early childhood traumatic brain injury. Rehabil Psychol. 2010;55(1):48–57.
6. Ganesalingam K, Yeates KO, Taylor HG, Walz NC, Stancin T, Wade S. Executive functions and social competence in young children 6 months following traumatic brain injury. Neuropsychology. 2011;25(4):466–476.
7. Gerrard-Morris A, Taylor HG, Yeates KO, et al. Cognitive development after traumatic brain injury in young children. J Int Neuropsychol Soc. 2010;16(1):157–168.
8. Kurowski BG, Taylor HG, Yeates KO, Walz NC, Stancin T, Wade SL. Caregiver ratings of long-term executive dysfunction and attention problems after early childhood traumatic brain injury: family functioning is important. PMR. 2011;3(9):836–845.
9. Yeates KO, Armstrong K, Janusz J, et al. Long-term attention problems in children with traumatic brain injury. J Am Acad Child Adolesc Psychiatry. 2005;44(6):574–584.
10. Rivara FP, Vavilala MS, Durbin D, et al. Persistence of disability 24 to 36 months after pediatric traumatic brain injury: a cohort study. J Neurotrauma. 2012;29(15):2499–2504.
11. Eisenberg MA, Andrea J, Meehan W, Mannix R. Time interval between concussions and symptom duration. Pediatrics. 2013;132(1):8–17.
12. Rivara FP, Koepsell TD, Wang J, et al. Incidence of disability among children 12 months after traumatic brain injury. Am J Public Health. 2012;102(11):2074–2079.
13. Walz NC, Cecil KM, Wade SL, Michaud LJ. Late proton magnetic resonance spectroscopy following traumatic brain injury during early childhood: relationship with neurobehavioral outcomes. J Neurotrauma. 2008;25(2):94–103.
14. Sesma HW, Slomine BS, Ding R, McCarthy ML; Children's Health After Trauma (CHAT) Study Group. Executive functioning in the first year after pediatric traumatic brain injury. Pediatrics. 2008;121(6):e1686–e1695.
15. Centers for Disease Control and Prevention. Report to congress: the management of traumatic brain injury in children. https://www.cdc.gov/traumaticbraininjury/pubs/congress-childrentbi.html. Published 2018. Accessed April 2, 2018.
16. Haarbauer-Krupa J. Schools as TBI service providers. ASHA Leader. 2012;17(8):10–13.
17. Todis B, Glang A. Redefining success: results of a qualitative study of postsecondary transition outcomes for youth with traumatic brain injury. J Head Trauma Rehabil. 2008;23(4):252–263.
18. Davies S, Fox E, Glang A, Ettel D, Thomas C. Traumatic brain injury and teacher training: a gap in educator preparation. Phys Disabil. 2013;32(1):55–65.
19. Hooper SR. Myths and misconceptions about traumatic brain injury: endorsements by school psychologists. Exceptionality. 2006;14(3):171–182.
20. Davies SC, Ray AM. Traumatic brain injury: the efficacy of a half-day training for school psychologists. Contemp School Psychol. 2014;18(1):81–89.
21. Dreer LE, Elliott TR, Shewchuk R, Berry JW, Rivera P. Family caregivers of persons with spinal cord injury: predicting caregivers at risk for probable depression. Rehabil Psychol. 2007;52(3):351–357.
22. Ernst WJ, Gallo AB, Sellers AL, et al. Knowledge of traumatic brain injury among educators. Exceptionality. 2016;24(2):123–136.
23. Ettel D, Glang AE, Todis B, Davies SC. Traumatic brain injury: persistent misconceptions and knowledge gaps among educators. Exceptionality Educ Int. 2016;26,(1):1–18.
24. Linden MA, Braiden H-J, Miller S. Educational professionals' understanding of childhood traumatic brain injury. Brain Inj. 2013;27(1):92–102.
25. Graff DM, Caperell KS. Concussion management in the classroom. J Child Neurol. 2016;31(14):1569–1574.
26. Evans K, Hux K, Chleboun S, Goeken T, Deuel-Schram C. Persistence of brain injury misconceptions among speech language pathology graduate students. Contemp Iss Comm Sci Disord. 2009;36:166–173.
27. Glang A, Ettel D, Todis B, et al. Services and supports for students with traumatic brain injury: survey of state educational agencies. Exceptionality. 2015;23(4):211–224.
28. Dettmer J, Ettel D, Glang A, McAvoy K. Building statewide infrastructure for effective educational services for students with TBI: promising practices and recommendations. J Head Trauma Rehabil. 2014;29(3):224–232.
29. Gioia GA, Glang A, Hooper S, Eagan Brown B. Building statewide infrastructure for the academic support of students with mild traumatic brain injury. J Head Trauma Rehabil. 2016;31(6):397–406.
30. Halstead ME, McAvoy K, Devore CD, Carl R, Lee M, Logan K. Returning to learning following a concussion. Pediatrics. 2013;132(5):948–957.
31. Ajzen I. The theory of planned behavior. Organ Behav Human Decis Process. 1991;50(2):179–211.
32. Ajzen I, Joyce N, Sheikh S, Cote NG. Knowledge and the prediction of behavior: the role of information accuracy in the theory of planned behavior. Basic Appl Soc Psychol. 2011;33(2):101–117.
33. Tschannen-Moran M, Hoy AW. Teacher efficacy: capturing an elusive construct. Teach Teacher Educ. 2001;17(7):783–805.
34. Glang A, Ylvisaker M, Stein M, Ehlhardt L, Todis B, Tyler J. Validated instructional practices: application to students with traumatic brain injury. J Head Trauma Rehabil. 2008;23(4):243–251.
35. Ylvisaker M, Adelson PD, Braga LW, et al. Rehabilitation and ongoing support after pediatric TBI: twenty years of progress. J Head Trauma Rehabil. 2005;20(1):95–109.
36. Cook J. Cooperative problem-seeking dialogues in learning. Paper presented at: Fifth International Conference on Intelligent Tutoring Systems; June 19–23, 2000; Montreal, Canada.
37. Mitchem K, Koury K, Fitzgerald G, et al. The effects of instructional implementation on learning with interactive multimedia case-based instruction. Teacher Educ Special Educ. 2009;32(4):297–318.
38. US Department of Education Office of Planning Evaluation and Policy Development. Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies. Washington, DC: US Department of Education; 2010.
39. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Instructional design variations in internet-based learning for health professions education: a systematic review and meta-analysis. Acad Med. 2010;85(5):909–922. doi:10.1097/ACM.1090b1013e3181d1096c1319.
40. Feingold A. Effect sizes for growth-modeling analysis for controlled clinical trials in the same metric as for classical analysis. Psychol Methods. 2009;14(1):43.
41. Mohr JD, Bullock LM. Traumatic brain injury: perspectives from educational professionals. Prev Sch Fail Altern Edu Child Youth. 2005;49(4):53–57.
42. US Department of Education National Center for Education Statistics. The digest of education statistics, 2015 (NCES 2016-014), Table 204.60. https://nces.ed.gov/fastfacts/display.asp?id=59. Published 2016. Accessed March 7, 2018.
43. Todis B, McCart M, Glang A. Hospital to school transition following traumatic brain injury: a qualitative longitudinal study. Neurorehabilitation. 2018;42(3):269–276. doi:10.3233/NRE-172383.
44. Yoon KS, Duncan T, Lee SWY, Scarloss B, Shapley K. Reviewing the evidence on how teacher professional development affects student achievement (issues & answers report, REL 2007, no. 033). http://ies.ed.gov/ncee/edlabs. Published 2007. Accessed March 7, 2018.
45. Glang A, Todis B, Sublette P, Eagan-Brown B, Vaccaro M. Professional development in TBI for educators: the importance of context. J Head Trauma Rehabil. 2010;25(6):426–432.
46. Masters J, de Kramer RM, O'Dwyer LM, Dash S, Russell M. The effects of online professional development on fourth grade English language arts teachers' knowledge and instructional practices. J Educ Comput Res. 2010;43(3):355–375.
47. Meyen EL, Yang CH. Online staff development for teachers: multi-state planning for implementation. J Special Educ Technol. 2005;20(1):41–54.
48. Moon J, Passmore C, Reiser BJ, Michaels S. Beyond comparisons of online versus face-to-face PD: commentary in response to Fishman et al., “Comparing the impact of online and face-to-face professional development in the context of curriculum implementation.” J Teacher Educ. 2014;65(2):172–176.
49. De La Paz S, Hernández-Ramos P, Barron L. Multimedia environments in mathematics teacher education: preparing regular and special educators for inclusive classrooms. J Technol Teacher Educ. 2004;12(4):561–575.
50. Fisher JB, Schumaker JB, Culbertson J, Deshler DD. Effects of a computerized professional development program on teacher and student outcomes. J Teacher Educ. 2010;61(4):302–312.
51. Every Student Succeeds Act, S 1177, 114th Cong, 1st Sess (2015). Washington, DC: United States Congress.
52. Dede C, Ketelhut DJ, Whitehouse P, Breit L, McCloskey EM. A research agenda for online teacher professional development. J Teacher Educ. 2009;60(1):8–19.
53. Reeves T, Pedulla J. Bolstering the impact of online professional development. J Educ Res Policy Studies. 2013;13(1):50–66.
Keywords:

brain injury; education; randomized controlled trial; schools; staff development; training program; Web-based

© 2018 Wolters Kluwer Health, Inc. All rights reserved.