Tapping the expertise of crowds is a well-established business practice for uncovering innovation (4,20). Crowdsourcing is a web-based problem-solving model that seeks innovative solutions from a distributed network of individuals through an open call for proposals (9,20). This approach has many advantages, including speed, efficiency, low cost, and potential for unearthing novel ideas highly relevant to the intended audience (5,17). Although the use of this approach in the biomedical (12) and health (2,8,13,21) fields is increasing, crowdsourcing remains a novel approach in public health (5). A 2014 systematic review (18) identified 21 applications of crowdsourcing in health research, only three of which were specifically in public health; however, none involved the generation or identification of public health intervention strategies or physical activity (PA) programs.
Conventional thinking around dissemination of evidence-based intervention models suggests that evidence should flow from 1) rigorously designed research studies, to 2) publication in peer-reviewed journals and inclusion in systematic reviews, to 3) development of evidence-based guidelines and, finally, to 4) standard public health practice. However, this process moves slowly, and much of the evidence is lost along the way (e.g., through failure to be accepted in peer-reviewed journals or exclusion from systematic reviews) (7). When they are finally disseminated, intervention models originally conceptualized and tested in controlled settings may not be viable in resource-constrained practice environments. By comparison, crowdsourced intervention models offer potential for quicker dissemination because they often already exist in a form that can be evaluated and scaled, and the crowdsourcing process is faster than the typical academic research and publication timeline. Given their origins in practice settings, crowdsourced programs may also be more amenable to the realities of real-world practice.
Childhood obesity remains one of the most complex, wide-reaching, and urgent public health concerns in the United States (16) and in an increasing number of countries worldwide (15). ChildObesity180 at Tufts University (6) blends scientific evidence and rigor with innovation and experience from the business sector to develop, implement, evaluate, and scale up high-impact childhood obesity prevention initiatives, with a focus on reaching those children disproportionately impacted by obesity. An initiative of ChildObesity180, the Active Schools Acceleration Project (ASAP) (1) is focused on increasing quality PA in schools, because low levels of PA may contribute to the childhood obesity epidemic (10). To our knowledge, ASAP is the first to use a crowdsourcing strategy to identify promising intervention models to promote children's PA and health. Similar to positive deviance (19), which takes an asset-based approach to identifying individuals who have thrived in a challenging environment (14), this crowdsourcing strategy aimed to uncover examples of programs that were being successfully implemented in schools despite resource and time constraints and competing educational priorities.
The purpose of this manuscript is to present a case example of using crowdsourcing to surface, select, disseminate, and scale up innovations in school-based PA promotion. Although this case focuses on PA promotion specifically, lessons learned are transferable to other public health domains.
ChildObesity180 engaged its senior leadership and charter members, a strategic advisory board of 17 multisector CEO-level leaders, to design a strategy for increasing PA among children. The team reviewed the research literature and recommendations and engaged in two in-person brainstorming sessions in September 2010 and January 2011; a subgroup advanced the work between meetings. Out of this process grew a desire to capitalize on existing “bright spots” in school-based PA while maintaining an explicit focus on feasibility, timely impact, and potential to overcome documented barriers. Incorporating feedback and ideas from this process, ASAP leadership designed a four-phased approach to uncovering and scaling up school-based PA programs: surfacing candidates, selecting promising programs, disseminating the innovations, and driving national scale (Fig. 1). The following sections describe these phases in more detail, highlighting components that may be most instructive for other practitioners interested in crowdsourcing. The key lessons learned are described more fully in the Discussion section. Human subject procedures were determined to be exempt by the Tufts University Institutional Review Board.
A 2013 Harvard Business Review article (3) noted that contests are a particularly useful crowdsourcing strategy for solving complex or novel challenges or problems for which no established best-practice solutions exist. In ASAP, a nationwide contest (the “innovation competition”) was held from February 7 to April 2, 2012, and included two categories: school programs and technological innovations. To elicit widespread response, the competition offered high-value prize opportunities. The school competition included two national winners ($100,000 each) and up to 10 regional winners ($25,000 each), and the technology competition included two national winners ($50,000 each). This article focuses on the school competition, which was developed with the intent of identifying program models that could later be disseminated nationally in partnership with ASAP. The technology category was open to developers, entrepreneurs, and inventors with innovative, promising technologies for increasing quality PA in schools. The primary intent of the technology competition was to provide funding that would support winners in expanding their own work, but the winners were not considered for national dissemination partnership.
The structure of the school competition, with at least one prize guaranteed for each region, aligned with the goal to elicit applications from communities nationwide. The competition was specifically designed to surface examples of programs that had already been implemented rather than program ideas/concepts. This focus enabled the evaluation process to include assessment of effectiveness in real-world settings. Furthermore, established programs were likely to have materials/resources in place, maximizing the speed with which they could be disseminated. Reflecting ChildObesity180's mission to reverse the obesity trend among 5- to 12-yr-old children, schools serving children in grades K–6 were eligible to apply. Applications were required to be submitted by teams of two to six people (including teachers, administrators, parents, coaches, students, and other educators) representing a school or district; this team-based model was designed to encourage collaboration among individuals involved in program implementation. Applicants were also required to obtain signatures of support from a principal or superintendent to ensure administrative support of competition participation, which could result in the scaling of the program model originating in a given school.
The innovation competition application included questions about the applicant school or district (e.g., school location, student enrollment, and demographics) and seven short-answer questions regarding the following dimensions of the school's PA program: originality; fun/child engagement; capacity to promote health; cost-effectiveness/sustainability; scalability; potential reach to all types of children, regardless of age, fitness, or athletic ability; and capacity to instill transferable skills/habits/attitudes. Responses to each item were limited to 500 words. Applicants were also encouraged to submit a short video or other supporting multimedia materials (e.g., photos, PowerPoint presentations, website URLs, program materials, and news articles). Evaluation criteria (Fig. 2) were informed by other competition models that targeted a similar audience, with the final weighted scoring criteria determined by researchers at Tufts University.
Several strategies were used to generate awareness of the competition and encourage schools nationwide to apply. Recognizing the potential for influential thought leaders to motivate action, ASAP pursued advocates interested in children's PA to support various aspects of the competition (e.g., promote the competition via social media networks, participate as a competition judge, or serve as a speaker at a culminating award ceremony). A roster of over 30 potential supporters was assembled from professional network contacts and public figures who had supported similar causes in the past, as identified through web search engines. Customized e-mail invitations were sent to these individuals, with each highlighting a specific request for support. Those who agreed to support the initiative included First Lady Michelle Obama as well as other high-profile individuals such as entrepreneurs, technology experts, and public figures. The support of these champions lent credibility, visibility, and expertise to the competition. A centerpiece of the promotional efforts was a video featuring First Lady Michelle Obama announcing the ASAP competition. She had launched her Let's Move! initiative in 2010 to address the childhood obesity epidemic and therefore had a strong interest in broadscale PA promotion. The video was posted on YouTube (see Video, Supplemental Content 1, http://links.lww.com/TJACSM/A0) and linked to the ASAP website in January 2012.
An educational marketing firm and a public relations agency were engaged to help generate nationwide awareness about the competition. In February 2012, a press release was issued featuring the YouTube video and a comprehensive social media strategy was launched, including regular (3–4 per week) Facebook and Twitter posts. The educational marketing firm delivered promotional e-postcards to two cohorts. The first cohort of school district-level leaders included directors of health and physical education, curriculum directors, superintendents, and before-/after-school program directors; this group received information about the competition twice, approximately 2 wk apart, in February 2012. The second cohort, contacted once in March 2012, included building-level school employees such as principals, physical education department chairpersons, and physical education teachers. In addition to the general information about the competition, this cohort also received information about a webinar that was being offered in March 2012 to answer questions from potential applicants. The competition and webinar were also promoted by national partner organizations in the education sector (including Alliance for a Healthier Generation; National Association for Sport and Physical Education; Fuel Up to Play 60; and President's Council on Fitness, Sports, and Nutrition) through Listservs and other communication channels.
Several process and outcome evaluation methods were used to evaluate the competition. To determine the reach of the first lady's promotional video, the number of YouTube views was monitored. The size of the e-postcard distribution lists, along with the message open rates and click-through rates, was also monitored. The effectiveness of these communications was evaluated on the basis of the total number of applications received. To determine the breadth of the competition's reach nationally, each application was categorized by region (Atlantic, Central/Mountain, Midwest, Southeast, New England, New York/New Jersey, Southwest, West) and the proportion of applications from each region was compared with the distribution of all US elementary and middle schools. To evaluate variety in the types of programs represented, applications were categorized by time of school day (before, during, or after school, or multiple), and the percentage of schools within each category was calculated.
Selecting Promising Programs
Innovation competition applications were judged through two rounds. In both rounds, judges used a seven-item scoring sheet (Fig. 2) matching the seven short-answer items included in the application. Each item was weighted 10% or 20%, depending on its importance for potential dissemination. Judges scored each item 0–5. In round 1, a total of 19 individuals (including ASAP staff and Tufts University graduate students) scored the applications, with two individuals assigned to each application. All judges were provided with an evaluation guide that outlined the scoring criteria; round 1 judges also participated in an in-person orientation session during which two to three entries were scored as a group and judges were encouraged to share their rationale for scoring decisions, calibrating and thereby increasing the consistency of scoring practices. Weighted scores were summed to determine an overall score. In cases where the two scores differed by a large margin, the application was re-reviewed by a third judge, and the average of the three scores determined the final score. On the basis of these scores, 75 applications advanced from round 1 to round 2, which included a panel of 32 expert judges identified and recruited through the professional network of ChildObesity180's multisector leaders. These judges included individuals from academia and government agencies, school officials and educators, healthcare professionals, parents, technology experts, and leaders from the private sector and child advocacy groups. Each application was judged by four experts, and the four scores were averaged to determine the overall score. The four highest-scoring applications from each geographic region were advanced as finalists. A panel of four ASAP staff members then reviewed all finalist applications and the expert scores and determined by consensus the top school from each region plus one additional high-scoring school.
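For illustration, the two-judge weighted scoring and third-judge adjudication described above can be sketched in code. The per-item weights and the disagreement threshold below are hypothetical placeholders: the article reports only that items were weighted 10% or 20% and that a "large margin" of disagreement triggered a third review.

```python
# Sketch of the round 1 weighted scoring process. WEIGHTS and
# DISAGREEMENT_THRESHOLD are assumed values, not figures from the article.

WEIGHTS = [0.20, 0.20, 0.20, 0.10, 0.10, 0.10, 0.10]  # one weight per criterion
DISAGREEMENT_THRESHOLD = 1.0  # gap (on the 0-5 scale) that triggers a third review

def weighted_score(item_scores):
    """Combine seven 0-5 criterion scores into a single weighted total."""
    assert len(item_scores) == len(WEIGHTS)
    return sum(w * s for w, s in zip(WEIGHTS, item_scores))

def final_score(judge_a, judge_b, third_judge=None):
    """Average two judges' weighted scores; if they differ by a large
    margin, bring in a third judge and average all three scores."""
    a, b = weighted_score(judge_a), weighted_score(judge_b)
    if abs(a - b) > DISAGREEMENT_THRESHOLD and third_judge is not None:
        return (a + b + weighted_score(third_judge)) / 3
    return (a + b) / 2
```

With weights summing to 100%, an application rated 5 on every item by both judges earns the maximum overall score of 5.0, and a wide two-judge split (e.g., 5.0 vs. 3.0) is resolved by averaging in the third score.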
The two strongest applications, determined by consensus, were named national winners, and the other seven were regional winners. The panel also chose 10 schools to be honorable mention award recipients ($2500 each). In May 2012, ASAP staff conducted site visits with the national and regional winners to confirm that the programs were being implemented as described in the applications. In June 2012, ASAP honored the winning schools at an awards celebration in Washington, DC. In addition to recognizing winners, the event created opportunities for networking and exchanging insights and also helped build a national network that could support future efforts in program dissemination and scale. The event also drew media interest for the winning programs, raising their visibility.
From the nine winning programs, ASAP staff selected three for national dissemination. The programs were chosen by consensus among ASAP staff members on the basis of readiness for scale; the key factors considered included cost, availability of preexisting program guides/materials, and demonstrated history of expansion (all three programs had already expanded to multiple schools: nearly 200 schools for program 1, 15 for program 2, and 115 for program 3). The programs' cost effectiveness was a particularly important factor given ASAP's ultimate goal to disproportionately reach children at high risk for obesity, including low-income children. Program 1 (henceforth the “Before-School Program”) was a structured, before-school PA program; program 2 (henceforth the “In-Class Program”) was an in-class activity break program; and program 3 (henceforth the “Running and Walking Program”) was a running and walking club program. These three programs also represented diverse options: they required different types of spaces, financial resources, and staffing and together offered options for implementation at multiple times in the school day.
Before the programs were disseminated, an expert consultant was commissioned to lead an evaluation of the three programs. This included analysis of online information and other materials provided by the programs as well as observations of a convenience sample of six schools (one school with the Before-School Program, two schools with the In-Class Program, and three schools with the Running and Walking Program). The number of observations conducted was limited by time/resource constraints and school availability. These observations occurred during regular school hours and included unstructured interviews with principals, teachers, and volunteers who worked with each program, as well as monitoring of PA levels using a modified version of the System for Observing Fitness Instruction Time tool. For these observations, trained data collectors, guided by prerecorded audio cues, completed 10 s of observation followed by 10 s of recording activity intensity (sedentary, low, moderate, or vigorous). Before each session, the expert consultant selected a sample of children to be observed; subjects were selected to maximize diversity in terms of sex, race/ethnicity, and body type within the sample. Each subject was observed for approximately 10 min. Recording sheets included program name, child sex, and child grade. The average percentage of time in moderate or vigorous activity was calculated for each program. These results, along with other findings from interviews and analyses of program materials, were summarized in a report by the expert consultant, along with recommendations for potential modifications to the programs to maximize time in moderate-to-vigorous PA (MVPA). These reports were shared with leaders from the three programs, who were encouraged to update materials accordingly.
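As a minimal sketch, the percentage of observed time in MVPA under an interval-coded protocol such as the one above can be computed as the proportion of coded intervals rated moderate or vigorous. The intensity labels used here mirror the four categories named in the text but are assumptions about the exact codes on the recording sheets.

```python
# Sketch of deriving percent time in MVPA from interval-coded observations
# (10-s observe / 10-s record cycles). Category labels are assumed.

MVPA_CODES = {"moderate", "vigorous"}

def percent_mvpa(intervals):
    """intervals: list of intensity codes, one per observe/record cycle
    ('sedentary', 'low', 'moderate', or 'vigorous'). Returns the percentage
    of intervals spent in moderate or vigorous activity."""
    if not intervals:
        return 0.0
    mvpa_count = sum(1 for code in intervals if code in MVPA_CODES)
    return 100.0 * mvpa_count / len(intervals)
```

Because each observe/record cycle has a fixed length, the fraction of MVPA-coded intervals is a direct estimate of the fraction of observed time in MVPA.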
Disseminating the Innovations
In spring 2013, ASAP launched a microgrant campaign, which gave schools across the country the opportunity to apply for a $1000 microgrant to support implementation of one of the three programs. In addition to supporting adoption of new PA programs in schools, this campaign afforded the opportunity to pilot test the three models to better understand their potential for national scale. A microgrant strategy was selected to award a small one-time injection of funding that could inspire schools to launch a new PA program. Promotion of the call for applications was supported by an educational marketing firm and comprised a variety of tactics, including direct outreach to key districts (e.g., phone calls or personalized e-mails); placement in grant announcement Listservs and websites; a variety of website, Twitter, Facebook, and newsletter posts from education-related organizations in multiple states; and announcements on several social media accounts and websites of the Association for Health, Physical Education, Recreation and Dance, the professional association for school-based health and physical education teachers. A press release was issued by ChildObesity180, and the microgrants were mentioned in an Associated Press article about the launch of Michelle Obama's Let's Move! Active Schools campaign. The microgrants were also highlighted on the Let's Move! Active Schools website as one of the two sources of funding available to support PA programming in schools.
The microgrant application was completed online and included 40 multiple-choice and short-answer questions designed to gather information on the grant applicant's role (e.g., PE teacher, classroom teacher, or parent), school demographics, and the applicant's plans for implementing a program if awarded a microgrant. Three open-ended questions designed to assess schools' readiness to implement a new program drove the scoring process (Fig. 3); each of these dimensions was scored on a scale from 0 ("insufficient information") to 5 ("strong"), with the sum of the individual scores producing the overall application score. Given available funding, it was determined that approximately 1000 schools with the highest scores would win microgrants. Schools received $500 at the beginning of the school year (September 2013) and the remaining $500 in February 2014. To aid in implementation, each grant recipient received a detailed guide specific to their chosen program as well as a slide deck that could be used to explain the program to others in the school. ASAP also partnered with the In-Class Program and the Running and Walking Program to develop video content to support program implementation; ASAP posted this material online and disseminated links to grant recipients via e-mail. The Before-School Program already had substantive video content available and distributed that content to grantees directly.
Three electronic surveys were administered to school champions during the program year: a preimplementation survey in October 2013, a midcourse survey in December 2013, and a postimplementation survey in May 2014. Questions included items about the program's reach (e.g., percentage of students participating); amount of activity added (e.g., whether the program added a new opportunity or replaced an existing one; average days per week and minutes per session); and perceptions of the program among students, teachers, and parents. For this article, responses to the postimplementation survey only were analyzed because these responses reflect the champions' perspective on the full program year. For all grantee schools and subgroups stratified by the program (i.e., Before-School Program schools, In-Class Program schools, and Running and Walking Program schools), the following were calculated: percentage of schools in which the ASAP program was a new opportunity (not replacing another program); mean minutes per week that the program was implemented; percentage of schools in which >50% of the school's students participated in the ASAP program; percentage of respondents indicating (a) that they planned to reimplement the program the following year and (b) that they would recommend the program to another school; and the percentage of respondents indicating that the response to the program was “all positive” or “mostly positive” among (a) students, (b) parents, and (c) school staff/administrators.
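The stratified survey summaries described above can be sketched as follows. The field names and response structure are illustrative assumptions, not the actual survey instrument; the sketch shows only the pattern of computing each metric for all grantees and for each program-specific subgroup.

```python
# Hypothetical sketch of stratified postimplementation survey summaries.
# Keys such as 'program' and 'new_opportunity' are illustrative, not the
# actual survey item names.

from collections import defaultdict

def summarize(responses):
    """responses: list of dicts with keys 'program' (str),
    'new_opportunity' (bool), 'minutes_per_week' (number), and
    'majority_participated' (bool). Returns per-program summaries plus an
    overall row under the key 'All'."""
    groups = defaultdict(list)
    for r in responses:
        groups[r["program"]].append(r)
        groups["All"].append(r)
    summary = {}
    for program, rows in groups.items():
        n = len(rows)
        summary[program] = {
            "n": n,
            "pct_new": 100.0 * sum(r["new_opportunity"] for r in rows) / n,
            "mean_min_per_wk": sum(r["minutes_per_week"] for r in rows) / n,
            "pct_majority": 100.0 * sum(r["majority_participated"] for r in rows) / n,
        }
    return summary
```

Each metric reported in the text (e.g., percentage of schools where the program was a new opportunity, mean minutes per week) follows this same aggregate-then-stratify pattern.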
A third-party organization provided data on grantee school characteristics, including student race/ethnicity, free/reduced-price lunch eligibility, and urbanicity, gathered from publicly available data sources, such as the National Center for Education Statistics. For all grantee schools and program-stratified subgroups, the following were calculated: average total school enrollment; percentage of schools where ≥50% of students were nonwhite; percentage of schools where ≥50% of students were eligible for free/reduced-price lunch; and percentage of schools in urban, suburban, town, and rural communities.
Driving National Scale
In the microgrant campaign, the Running and Walking Program was, by a sizeable margin, the program for which the greatest proportion of schools applied (see Results). Schools cited flexibility, accessibility, ease of implementation, and appeal to kids of all ages as compelling features of the Running and Walking Program. Therefore, in the scaling up phase, ASAP launched the New Balance Foundation Billion Mile Race (www.BillionMileRace.org), which challenges children nationwide to collectively walk, jog, run, or wheel 1 billion miles at school. Upon registering with Billion Mile Race, participating schools activate a custom school profile page housed on the initiative website. Registrants enter miles as they are logged by students, and those miles populate both the school's own profile page as well as the billion mile national ticker. Schools that do not have an existing walking/running program, or do have one but are interested in obtaining more information about alternative options, are provided information about the Running and Walking Program. To date, the campaign has been promoted mainly through mass e-mail communications and professional conferences targeting educators. The number of unique registrations and total number of miles logged are tracked on an ongoing basis.
Surfacing Candidates and Selecting Promising Programs
Process evaluation data for the surfacing stage show that the innovation competition marketing efforts reached thousands of educators. In phase 1 of the e-postcard campaign, messages were sent to 10,989 unique district-level leaders; 17.3% of e-mails were opened, with a click-through rate of 1.2%. In phase 2, 16,055 e-mails were sent to building-level professionals, with an open rate of 8.7% and a click-through rate of 1.0%. As of April 2012, when the innovation competition closed, the YouTube video featuring the first lady had been viewed 6630 times.
The competition elicited a total of 427 applications for the school program category, along with 88 applications for technology innovations. The school program applicant pool represented all 50 states, and the regional distribution of applications was similar to the distribution of all US schools (Fig. 4). Programs represented in the applications occurred at a variety of times throughout the school day, with 55% of programs taking place only during normal school hours; 19% occurring at multiple times throughout the day (i.e., occurring during more than one of the following times: before, during, or after school); 16% occurring after school; and 10% taking place before school.
The number of students observed and total observation time during the in-school modified System for Observing Fitness Instruction Time differed across programs: for the Before-School Program, 24 students were observed over a total of 220 min; for the In-Class Program, 27 students were observed over a total of 221 min; and for the Running and Walking Program, 49 students were observed over a total of 453 min. The mean proportion of time students engaged in MVPA was 35% for the Before-School Program, 16% for the In-Class Program, and 39% for the Running and Walking Program.
Disseminating the Interventions
In the dissemination phase, 1208 schools submitted microgrant applications. There were considerable differences in the popularity of the programs: 12% of applicants sought funds to support the Before-School Program, 21% the In-Class Program, and 67% the Running and Walking Program. Among the 1002 schools that received grant funding, champions represented diverse roles: 48% were physical educators, 15% were administrators, 10% were classroom teachers, 7% were parents, and 20% held other roles. Among those champions, 71% (n = 710) responded to the postimplementation survey. Table 1 shows school enrollment/demographic data (from publicly available data sources) as well as data on program characteristics for 1) all ASAP grantees and 2) grantees stratified by program type. Notably, in 59% of schools, the majority of students were free/reduced-price lunch eligible, and, in 46% of grantee schools, the majority of students were nonwhite.
Driving National Scale
As of this writing, 4050 schools have registered in Billion Mile Race, including schools from all 50 states, and those schools have collectively logged 16.2 million miles. Further results are forthcoming.
This case study outlines a novel process for crowdsourcing innovative programs to promote PA in schools, answering the Institute of Medicine's call for bold action to make PA programs more accessible for children nationwide, particularly in schools (10,11). This process was influenced by business community practices and represented a substantial departure from the traditional pipeline for developing and disseminating evidence-based intervention strategies. One advantage of this process was speed: in less than 4 yr, ASAP surfaced over 400 programs that had been implemented in schools nationwide, identified three models that were particularly suited for scaling up, disseminated those programs to over 1000 schools nationwide, and used findings from that dissemination process to inform a broader plan for national scale.
Several key lessons learned emerge from our experience. One advantage of the innovation competition was its call for practice-based models rather than program concepts; this approach provided the opportunity to understand the programs' demonstrated (rather than theoretical) impact and to accelerate dissemination of the winning programs. The competition featured high-value awards, support from influential advocates, and broadscale publicity through multiple channels, all of which helped to maximize the number and diversity of applications to the competition. These components may be difficult to replicate in other environments. There may be potential in exploring similar, more modest approaches to address local or regional public health challenges; such approaches might feasibly engage local celebrities, thought leaders, and communications partners.
The criteria used to evaluate submitted innovations included not only proximal outcomes of interest (e.g., short-term increases in PA) but also other important characteristics that contribute to overall impact and reproducibility at scale, such as cost-effectiveness, fun for children, and potential to impact behaviors over the life course. Determining those criteria a priori and making them available to competition applicants may have helped to maximize submissions' quality and alignment with initiative goals. In evaluating those submissions, input from diverse stakeholders representing not only the academic research community but also industry, healthcare, child education, and other sectors provided important insights around the programs' potential impact, feasibility, and appeal to schools. Future crowdsourcing campaigns in public health may likewise benefit from multisector input.
Similar to the innovation competition, the multichannel communications strategy used to promote the microgrants provided important visibility to help drive applications in the dissemination phase. Testing and evaluating multiple models in this dissemination phase provided valuable insights before transitioning to national scale. For example, the relative popularity of the three programs offered a helpful gauge of what might have the broadest appeal to schools nationwide. Furthermore, grantee surveys provided insight into the proportion of students reached by each program and typical duration of programming offered, whether the programs typically provided new PA opportunities or displaced others, and whether the programs were received positively by students and other stakeholders. Such insights are difficult to extract from controlled intervention trials, which generally have smaller, more geographically contained samples that may not fully reflect the realities of typical community-level practice.
This crowdsourcing campaign had several limitations. One ongoing challenge was balancing priorities related to methodological rigor against those related to cost and speed-to-market. For example, self-reported data collected from grantee schools may have been subject to bias, but validating those data using other methods (e.g., direct observation) would have been time and cost prohibitive. The three program models chosen for dissemination were selected on the basis of expert assessment of their potential. They were also evaluated through relatively time-efficient, low-cost direct observations, which informed recommendations delivered to program leaders. That said, these evaluation methods lacked the methodological rigor of other more resource-intensive methods for validating program effectiveness, such as randomized controlled trials. In addition, the results of the direct observations suggested that children spent less than half (16% to 39%) of the program time in MVPA; these observations informed recommendations for programmatic improvements, but the programs were not subsequently reevaluated. Additional rigorous evaluation approaches may have offered more conclusive evidence regarding program effectiveness but also would have substantially increased costs and slowed progress toward dissemination and scale. However, a randomized controlled trial funded by the National Institutes of Health is currently underway testing the In-Class Program and the Running and Walking Program; results of that research are forthcoming and may inform future program modifications. For other organizations seeking to use crowdsourcing to identify and scale innovations to address public health challenges, navigating the tension between rigor and speed/cost will likely remain a challenge.
This case study offers a novel framework for surfacing intervention models to increase PA, selecting promising programs, disseminating innovations, and driving national scale. Although this approach does not replace traditional scientific approaches, it does offer a compelling complementary option for more rapid scale. Future research should explore the potential of similar crowdsourcing approaches to address other public health challenges.
The authors wish to acknowledge First Lady Michelle Obama for her appearance in an internet video announcing the competition. The authors thank Shanti Sharma, Emilia Matthews, and Julie Gervis for their assistance with preparation and submission of the manuscript.
ChildObesity180 is funded by the Robert Wood Johnson Foundation and The JPB Foundation. Funding for the Active Schools Acceleration Project has been provided by the New Balance Foundation; the Cigna Foundation; a consortium of health plans including Blue Shield of California, HealthPartners, Inc., EmblemHealth, Humana Inc., Kaiser Permanente, Tufts Health Plan, The Regence Group, Wellpoint Foundation, Health Alliance Plan, Blue Cross Blue Shield of Florida Foundation, Horizon BCBS of NJ, and Blue Cross and Blue Shield of North Carolina; the Winebaum Family Trust; and the Withings Corporation. The Active Schools Acceleration Project is a partner of the Let's Move! Active Schools initiative.
The authors report no conflicts of interest. The results of the present study do not constitute endorsement by the American College of Sports Medicine.