Scale-up of the Internet-based Professional Learning to help teachers promote Activity in Youth (iPLAY) intervention: a hybrid type 3 implementation-effectiveness trial

Abstract

Background

Whole-of-school programs have demonstrated success in improving student physical activity levels, but few have progressed beyond efficacy testing to implementation at-scale. The purpose of our study was to evaluate the scale-up of the ‘Internet-based Professional Learning to help teachers promote Activity in Youth’ (iPLAY) intervention in primary schools using the RE-AIM framework.

Methods

We conducted a type 3 hybrid implementation-effectiveness study and collected data between April 2016 and June 2021, in New South Wales (NSW), Australia. RE-AIM was operationalised as: (i) Reach: Number and representativeness of students exposed to iPLAY; (ii) Effectiveness: Impact of iPLAY in a sub-sample of students (n = 5,959); (iii) Adoption: Number and representativeness of schools that received iPLAY; (iv) Implementation: Extent to which the three curricular and three non-curricular components of iPLAY were delivered as intended; (v) Maintenance: Extent to which iPLAY was sustained in schools. We conducted 43 semi-structured interviews with teachers (n = 14), leaders (n = 19), and principals (n = 10) from 18 schools (11 from urban and 7 from rural locations) to determine program maintenance.

Results

Reach: iPLAY reached ~ 31,000 students from a variety of socio-economic strata (35% of students were in the bottom quartile, almost half in the middle two quartiles, and 20% in the top quartile). Effectiveness: We observed small positive intervention effects for enjoyment of PE/sport (0.12 units, 95% CI: 0.05 to 0.20, d = 0.17), perceptions of need support from teachers (0.26 units, 95% CI: 0.16 to 0.53, d = 0.40), physical activity participation (0.28 units, 95% CI: 0.10 to 0.47, d = 0.14), and subjective well-being (0.82 units, 95% CI: 0.32 to 1.32, d = 0.12) at 24-months. Adoption: 115 schools received iPLAY. Implementation: Most schools implemented the curricular (59%) and non-curricular (55%) strategies as intended. Maintenance: Based on our qualitative data, changes in teacher practices and school culture resulting from iPLAY were sustained.

Conclusions

iPLAY had extensive reach and adoption in NSW primary schools. Most of the schools implemented iPLAY as intended and effectiveness data suggest the positive effects observed in our cluster RCT were sustained when the intervention was delivered at-scale.

Trial registration

ACTRN12621001132831.

Background

The benefits of physical activity for young people are extensive [1], but physical inactivity is a global public health problem [2]. The Global Matrix of Physical Activity Report Grades was first launched in 2014 to provide a better understanding of the levels of youth physical activity across the world [3]. The most recent Global Matrix, with data from 49 countries, revealed an average grade of ‘D’. This indicates that only 20% to 40% of children and adolescents are sufficiently active for optimal health [3]. Schools are ideal settings to address low levels of physical activity, as they provide access to large and diverse groups of children, and typically have the resources, personnel and facilities to promote physical activity [4].

Whole-of-school programs (also known as Comprehensive School Physical Activity Programs) [5] are considered by the International Society for Physical Activity and Health to be one of ‘eight investments that work for physical activity’ [6]. Whole-of-school programs engage school communities to provide young people with multiple opportunities to be active throughout the day, including quality physical education (PE), active classrooms, active recess and lunch breaks, after-school activities, and the promotion of active transportation to and from school. Whole-of-school physical activity interventions are considered the ‘gold standard’ for increasing physical activity in youth [7], but few have been ‘scaled up’ to achieve maximum population impact [8, 9].

Scaling up efficacious health-promoting interventions under real-world conditions to expand their reach into broader policy or practice is important, but challenging [10]. Of note, voltage drop (i.e., a reduction in effectiveness) [11] typically occurs as interventions progress from efficacy to effectiveness to implementation at-scale [12, 13]. For example, Lane and colleagues found that scaled-up physical activity interventions achieve, on average, less than 60% of their pre-scale effect size [13].

The Supporting Children’s Outcomes using Rewards, Exercise and Skills (SCORES) program [14, 15] was a whole-of-school physical activity intervention targeting children in low-income communities in New South Wales (NSW), Australia. The intervention successfully increased children’s objectively measured physical activity, cardiorespiratory fitness, and fundamental movement skill competency. However, SCORES relied heavily on support from researchers, thus limiting its scalability. Guided by the Consolidated Framework for Implementation Research [16], we modified SCORES so that it could be delivered using an online platform with minimal in-person support from external mentors (i.e., experienced teachers employed by the project). The adapted intervention is known as iPLAY (Internet-based Professional Learning to help teachers promote Activity in Youth) [17].

We evaluated iPLAY in a cluster randomised controlled trial (RCT) involving 22 primary schools in NSW [18]. At 12- and 24-months, students in the iPLAY group had greater increases in cardiorespiratory fitness compared with students in the control group. We also observed significant intervention effects for objectively measured physical activity during school lunch and recess breaks at 12- and 24-months, but no effects for total physical activity or other secondary outcomes. The cost of the intervention was AUD33 (USD26) per student. Implementation-effectiveness of iPLAY was examined concurrently with the cluster RCT. The aim of our current study was therefore to evaluate implementation of iPLAY at broad scale in primary schools across NSW using the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) framework [19]. Reach was considered the primary outcome in our implementation-effectiveness trial, as few school-based physical activity interventions have progressed from smaller effectiveness trials to larger scale-up trials [8]. However, we acknowledge that all components of the RE-AIM framework contribute to implementation success.

Methods

Study design

We conducted a type 3 hybrid implementation-effectiveness study [20] in primary schools in NSW. The primary focus of a type 3 trial is to evaluate implementation strategies, whilst also evaluating effectiveness of the intervention at the individual level. As such, program reach (i.e., estimated number and representativeness of students exposed to iPLAY) was considered the primary outcome of the study. The implementation-effectiveness study ran concurrently with the cluster RCT [18], which involved 22 schools. Control schools from the cluster RCT were compared with the sub-sample of schools that provided effectiveness data for the implementation-effectiveness study. Figure 1 provides an illustration of participants' flow through the cluster RCT and implementation-effectiveness studies. Approval for this study was provided by the Australian Catholic University (2014185) and University of Newcastle (H-2016–0135) human research ethics committees and the NSW Department of Education (DoE) (SERAP2014260). Our trial adheres to the Standards for Reporting Implementation Studies (StaRI) Statement [21] and was retrospectively registered with the Australian New Zealand Clinical Trials Registry (ACTRN12621001132831).

Fig. 1 Adapted CONSORT flow diagram indicating participant flow throughout the study

Participants and randomisation

All government-funded primary schools in NSW were considered eligible to participate (N = 1,808). Schools were recruited via presentations at conferences and meetings (e.g., regional meetings of the NSW Primary Principals Association) and advertisements sent by the NSW DoE and the Australian Council for Health, Physical Education and Recreation. The iPLAY study was also advertised via the NSW DoE Twitter feeds and Facebook pages. Recruitment of schools was ongoing from 2016 to 2020. We aimed to recruit a total of 180 schools (~ 10% of the total number of NSW government-funded primary schools). We recruited 147 schools, of which 22 were assigned to the cluster RCT and 115 to the implementation-effectiveness study (8 schools did not start the program and 2 schools were involved in a pilot study). We used a blocked randomisation process to ensure that schools in the RCT broadly represented government schools in NSW (see protocol paper for further details) [17]. Principals and teachers provided written consent for this study. Students and their parents/caregivers were provided with information statements; opt-out parental consent was applied, and students provided oral assent.

Intervention

The iPLAY intervention has been described in detail in our protocol paper [17]. In summary, iPLAY is a whole-of-school physical activity intervention that includes three curricular components ((i) quality physical education [22], (ii) classroom energizers, and (iii) active homework) and three non-curricular components ((iv) active playgrounds, (v) parental engagement, and (vi) community links). The program was designed to improve the primary outcome, cardiorespiratory fitness [23], by providing children with opportunities to participate in moderate-to-vigorous physical activity (MVPA) within and beyond the school setting.

Implementation strategies

Implementation strategies comprised: (i) professional learning for teachers, (ii) access to the iPLAY website, and (iii) provision of support from iPLAY mentors. Specifically, teachers were trained to deliver iPLAY via a combination of face-to-face and online learning modalities. The training involved a 2-h face-to-face workshop, four hours of online learning (8 × 30-min modules), a mentoring meeting, a peer observation, and a discussion at a staff meeting focused on iPLAY implementation. Completing these activities provided each classroom teacher with 14 h of professional learning that was registered with the NSW Education Standards Authority (the government education authority with the responsibility for the establishment and monitoring of quality teaching, learning, assessment, and school standards in NSW).

School principals also chose 1–3 teachers to be school leaders. These individuals received further online training and were responsible for the implementation of the non-curricular iPLAY components. iPLAY was the first whole-of-school physical activity intervention in which most teacher training occurred online, an implementation strategy chosen to support the scalability of iPLAY and enhance sustainability. Online delivery allowed teachers to complete their learning at a time that suited them. It also allowed program content to be standardised, with the aim of limiting voltage drop and preventing program drift [11]. The iPLAY website provided teachers and leaders with their online training modules, and access to downloadable resources (e.g., lesson plans, activity descriptions, and classroom movement break videos).

Measures and outcomes

We used the RE-AIM framework [19] to guide the evaluation of iPLAY when implemented at-scale. The framework was developed to address the slow translation of scientific knowledge into public health policy and practice [24]. It has been used extensively to guide scale-up of successful health promotion interventions [24] and allows researchers to assess both internal and external validity. All dimensions of the RE-AIM framework are important, but we chose ‘Reach’ as our primary outcome because few school-based physical activity interventions have progressed beyond effectiveness trials. Data were collected at the individual (student, teacher, leader) and organisational (school principal) levels, via a combination of quantitative (questionnaires and website usage data) and qualitative (interviews with principals, leaders and teachers) methods. Table 1 provides a description of the different quantitative and qualitative methods used to assess the five RE-AIM dimensions. A summary of this information is provided below:

  • Reach was defined as the estimated number and representativeness of students who were exposed to iPLAY. We accessed the MySchool website to obtain student enrolment data for schools that received iPLAY.

  • Effectiveness was defined as the impact of the iPLAY program in a sub-sample of students from the implementation-effectiveness cohort (n = 5,315), who were compared with students from the control group in the cluster RCT (n = 643). The intervention was designed to increase students’ physical activity within and beyond the school day. As such, we examined the impact of the intervention on students’ overall physical activity levels, active transportation to school, as well as their motivation and effort in PE and school sport. The intervention was guided by self-determination theory [12, 13], and we hypothesized that satisfying students’ basic psychological needs for competence, relatedness, and autonomy during PE and school sport would lead to improvements in well-being [14]. Measures included students’ self-reported effort during PE/sport [25, 26], enjoyment during PE/sport [25], perceptions of need support from teachers [27,28,29], typical physical activity participation [30], physical activity participation in the last week, organised sport participation (team and individual), active commuting to school [31], and subjective well-being (i.e., happiness and life satisfaction) [32].

  • Adoption was defined as the total number and representativeness of schools and teachers that participated in iPLAY (using data from the MySchool website), as well as the proportion of teachers and leaders who completed the iPLAY training modules (using data from the iPLAY website).

  • Implementation (fidelity) was defined as the extent to which the curricular and non-curricular components of the program were delivered as intended. Teachers were asked to self-report their implementation of the curricular and non-curricular components of the intervention using the iPLAY website. iPLAY mentors conducted observations of teachers’ PE lessons using the Supportive, Active, Autonomous, Fair and Enjoyable (SAAFE) framework [22]. Mentors assessed the quality of lesson delivery using 15 items aligned with the SAAFE principles (e.g., ‘Teacher provided praise on student effort and improvement’). Each item was scored on a 5-point Likert scale (1 = not at all true to 5 = very true) and the average of the 15 items was calculated and reported (see the sketch following this list). Feedback from the lesson observations was uploaded to the iPLAY website by mentors.

  • Maintenance was operationalised as the extent to which curricular and non-curricular iPLAY components were maintained in schools. We conducted 43 semi-structured interviews with teachers (n = 14), leaders (n = 19), and principals (n = 10) from 18 schools (11 from urban and 7 from rural locations) to determine program maintenance (see Supplementary Table 1 for the interview guide). Nine of the schools were classified as ‘high adopters’ (67 to 100% of online modules completed by teachers), six schools as ‘medium adopters’ (34 to 66% of online modules completed by teachers), and three schools as ‘low adopters’ (0 to 33% of online modules completed by teachers); these adoption bands are also used in the sketch following this list. Interviews were completed 18 to 24 months after baseline.
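
To make the scoring and classification rules above concrete, the following is a minimal sketch rather than the study's analysis code (which is not published with this article). The function names and example values are illustrative assumptions; the 15-item, 5-point SAAFE scoring rule and the 0–33%/34–66%/67–100% adoption bands come from the text.

```python
from statistics import mean

def saafe_lesson_score(item_ratings):
    """Mean of the 15 SAAFE observation items, each rated 1 (not at all true) to 5 (very true)."""
    if len(item_ratings) != 15:
        raise ValueError("The SAAFE observation checklist comprises 15 items.")
    if any(r < 1 or r > 5 for r in item_ratings):
        raise ValueError("Each item is scored on a 5-point Likert scale (1-5).")
    return mean(item_ratings)

def adoption_level(percent_modules_completed):
    """Classify a school by the percentage of online modules completed by its teachers,
    using the bands reported in the text: 0-33% low, 34-66% medium, 67-100% high."""
    if percent_modules_completed >= 67:
        return "high adopter"
    if percent_modules_completed >= 34:
        return "medium adopter"
    return "low adopter"

# Hypothetical example: one observed lesson and one school's module-completion rate.
print(round(saafe_lesson_score([4, 5, 3, 4, 4, 5, 4, 3, 4, 5, 4, 4, 3, 5, 4]), 2))  # 4.07
print(adoption_level(72))  # high adopter
```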

Table 1 Operationalisation of RE-AIM dimensions, data sources, outcome detail and analysis

Quantitative analysis

Descriptive statistics were used to calculate the primary outcome (Reach) using IBM SPSS version 28. Intervention effectiveness at the student level was determined by statisticians who were blinded to schools’ allocation. We tested for between-group differences in changes in students’ self-reported outcomes using mixed-effects models with a Gaussian link function and random effects for student, teacher, and school to account for clustering. Our analyses were consistent with the intention-to-treat principle [33] and included all participants, regardless of whether they completed follow-up assessments, using maximum likelihood to manage missing data. We compared students in the control group with those in the implementation-effectiveness trial. All models were run in R version 3 using Markov Chain Monte Carlo estimation. Statistical tests were two-tailed, and results were considered not statistically significant if 95% CIs contained 0. We also report sub-group analyses for boys and girls, as we observed group-by-time interaction effects in our cluster RCT. Cohen’s d was calculated by dividing the mean difference in change by the standard deviation of change for each outcome. Data were analysed from October to November 2021.
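
For readers unfamiliar with this type of analysis, a schematic form of the model described above is given below. The notation is ours and is intended only to illustrate the three-level random-intercept structure and the effect-size calculation reported in the text; it is not the exact specification used by the study statisticians.

\[
y_{ijkt} = \beta_0 + \beta_1\,\mathrm{Group}_{k} + \beta_2\,\mathrm{Time}_{t} + \beta_3\,(\mathrm{Group}_{k} \times \mathrm{Time}_{t}) + u_{k} + v_{jk} + w_{ijk} + \varepsilon_{ijkt}
\]

where \(y_{ijkt}\) is the outcome for student \(i\), taught by teacher \(j\) in school \(k\), at time \(t\); \(u_{k}\), \(v_{jk}\), and \(w_{ijk}\) are random intercepts for school, teacher, and student; and \(\beta_3\) captures the between-group difference in change. The standardised effect size was then

\[
d = \frac{\bar{\Delta}_{\mathrm{iPLAY}} - \bar{\Delta}_{\mathrm{control}}}{SD_{\Delta}},
\]

i.e., the mean difference in change between groups divided by the standard deviation of change for that outcome.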

Qualitative analysis

Four members of the research team conducted 43 semi-structured interviews with intervention teachers, leaders, and principals to determine program maintenance. Interviews lasted between 11 and 36 min (average = 20 min) and were audio recorded before being de-identified and transcribed verbatim. A qualitative content analysis [34] was carried out using NVivo 12 to organise and store the data. Initially, a phase of data immersion took place. Due to the close alignment with the RE-AIM framework and the purpose of analysis, a content analysis with an unconstrained categorisation matrix was applied. The qualitative dataset was coded for correspondence with the pre-identified categories, with additional inductively derived categories created to capture all salient content. Once the entire qualitative dataset was coded, descriptive summaries were developed with the aim of conveying the meaning of the overarching categories using sub-categories and participant quotes.

Results

Reach

iPLAY reached ~ 31,000 students (115 schools), representing approximately 6% of the total NSW primary school student population (~ 500,000) [35]. Just over 50% of students were female; 25% of all students were from language backgrounds other than English (Supplementary Table 2). Almost 10% of students were of Aboriginal or Torres Strait Islander descent. Students were from a variety of SES strata, as assessed by socio-educational advantage quartiles; over 35% of students were in the bottom quartile, almost half in the middle two quartiles, and 20% in the top quartile for SES.

Effectiveness

Characteristics and baseline values for outcome variables of students in the sub-sample used to determine effectiveness are presented in Supplementary Table 3. This sub-sample (n = 5,959) included students from the implementation-effectiveness group (i.e., intervention group) and the control group (n = 643) from the cluster RCT. Students in the iPLAY implementation-effectiveness group reported greater improvements in a range of outcomes compared with those in the control group (Table 2). Of note, there were small positive intervention effects for enjoyment of PE/sport (0.12 units, 95% CI: 0.05 to 0.20, d = 0.17), perceptions of need support from teachers (0.26 units, 95% CI: 0.16 to 0.53, d = 0.40), physical activity participation in the last week (0.28 units, 95% CI: 0.10 to 0.47, d = 0.14), and subjective well-being (0.82 units, 95% CI: 0.32 to 1.32, d = 0.12) at 24-months. Moderation effects by sex are reported in Supplementary Table 4.

Table 2 Effectiveness analyses for self-reported student outcomes

Adoption

A total of 115 schools were involved in the iPLAY implementation-effectiveness trial, which represents approximately 7% of all government-funded primary schools in NSW (Supplementary Table 5). Twenty schools dropped out of the study over the 4.5-year period. The implementation-effectiveness schools had a mean Index of Community Socio-educational Advantage (ICSEA) value of 990, with values ranging from 732 to 1,182 (the Australian median ICSEA value is 1,000). Most of the teachers were female and born in Australia, and their ages ranged from 22 to 70 years (Supplementary Table 6).

Program adoption by school leaders is presented in Table 3. At 12-months, over 90% of leaders had completed the five online learning modules and approximately 65% had completed the four action plan meetings. At 24-months, the proportion of leaders completing online modules remained the same, whilst the proportion completing action plans increased to 75%. Almost 90% of schools had at least one leader who completed all core learning, as per protocol, at both 12- and 24-months. Adoption of the intervention by teachers is also presented in Table 3. All teachers completed the mentor-facilitated professional learning workshop. At 12-months, almost 56% of teachers had completed all eight online learning modules, with this proportion increasing to 60% at 24-months.

Table 3 Intervention adoption rates by iPLAY leaders and teachers

Implementation (fidelity)

Implementation fidelity data for the curricular and non-curricular iPLAY components are presented in Table 4 and Supplementary Fig. 1. Implementation of curricular components at 24-months is summarized here. (i) Quality physical education: Almost 60% of schools reported meeting the required 150 min of PE/sport per week. Mentor-rated quality of these lessons was moderate, with over 90% of teachers rated above 3.0 (out of 5) on the SAAFE evaluation checklist (see Table 1 for additional details). (ii) Classroom energizers: Almost half of teachers (48%) reported incorporating at least 10 classroom energizers per week. (iii) Active homework: Three quarters of teachers (75%) reported that they included one active homework activity each week. Non-curricular components were as follows. (iv) Active playgrounds: Over half of leaders (53%) reported implementing strategies related to active playgrounds; however, no schools achieved > 40% of break time in MVPA. (v) Parental engagement: Just over 40% of schools distributed a newsletter to parents, with under one-third (30%) holding parent information sessions. Half of schools held a physically active school fundraiser. (vi) Community links: Principals reported that Sporting Schools (i.e., a federal government program) funding was used by almost all schools, although less than 10% of schools used this funding to have a teacher complete an accredited sports coaching program.

Table 4 Curricular and non-curricular intervention implementation fidelity

Maintenance

Findings from our interviews suggest that being involved in iPLAY led to sustained changes in teacher practices and school culture. These changes meant that schools, leaders, and teachers placed a greater emphasis on whole-school physical activity promotion, including quality PE and sport.

We had a huge culture of literacy, numeracy which was really nice, but there was no real focus on sports. So since this program we've definitely made sure we allocated the right amount of hours properly. We've designed programs around the iPLAY resources from the website, which was really good. They are accessible to all teachers … they can access them whenever they want. (Leader, Urban, High adoption)

Curricular features of iPLAY that teachers used in a sustained manner were: (i) reducing transition time, and (ii) maximising students’ opportunities to be active during lesson time. Similarly, teachers and leaders across all adoption levels revealed ongoing utilisation of active breaks and classroom energizers as a long-term legacy of their involvement in iPLAY.

… from a personal level, I completely changed the way I teach sport, completely changed it. So from my warm ups, to my modified games, to that release of control, having students not just in student-centred games, but having the kids designed the game, giving the kids the freedom to practise the skills how they choose to and having that real focus on increasing physical activity within the lesson … not having that seat and explain time and having that explain-as-they-play kind of structure … We do a brain break Go Noodle thing, dance or activity twice a day in my classroom and the kids love it. (Leader, Rural, High adoption)

Teachers also spoke favourably about sustaining the non-curricular elements of iPLAY following program delivery, such as changing the playground set-up to encourage varied physical activities in the school playground during recess and lunch times.

I think that before iPLAY, you walked around the playground and it was handball. Whereas now you're seeing lots of games and activities and group work and things like that out on the playground. (Teacher, Rural, Low adoption)

While many teachers spoke of immediate adoption and implementation of iPLAY curricular and non-curricular components, this was generally only a short- to medium-term change (i.e., 6 to 12 months). Enthusiasm waned once the iPLAY mentor presence and regular program engagement had ceased. For some schools, there was evidence that iPLAY resources were being used and sustainably integrated into school planning (high adoption schools), while for other schools this was less evident (low adoption schools).

I probably would say that it's not so much iPLAY, the program itself or using iPLAY as a way to describe what we're doing. I think that's fallen away, that's not happening. But a lot of what we implemented when we were on iPLAY, so, the active playground stuff, the accessing old resources on the website using them to program PE and sport programs. That's still happening, but because they completed the program two years ago, we've had a high staff turnover, but also, people just think that's been in place, ‘cause it's been in place for a long time so we don't refer it to iPLAY anymore. (Leader, Rural, High adoption)

Teachers from high and low adoption schools in rural and urban settings highlighted the importance of iPLAY leaders and mentors in facilitating implementation of iPLAY. This was also evident in relation to the long-term maintenance of iPLAY, where leaders were seen as promoters of the program even after the study finished. Additionally, some teachers recommended maintaining longer-term connections with mentors to better sustain iPLAY program ideas.

The leaders are still driving it. They're still asking too. ‘Can we do a professional learning session on this? I think we should focus on this fundamental movement skill’. Or whatever it may be … Maybe a little bit more training for the teachers [is needed] - the school-based mentors, so that they could kind of keep it perpetuating. And maybe giving them the time to be able to do that, some professional development that they could continue to do with their colleagues. (Principal, Urban, Medium adoption)

In summary, there is evidence to suggest that iPLAY resulted in sustained changes in practices and culture. For many schools this included an increased focus on PE and sport within programming, and the ongoing implementation of non-curricular strategies such as classroom energizer breaks and the provision of varied activities during recess and lunchtime periods. The provision of support (or lack thereof) from iPLAY leaders and mentors was identified as a key influence on the long-term maintenance of iPLAY strategies.

Discussion

As noted in the Lancet Physical Activity Series, few school-based physical activity interventions have progressed beyond effectiveness testing to be implemented at-scale [8]. This creates what has been termed a ‘know-do-dissemination gap’, which our study aimed to fill [36]. Bridging this gap is vital, as scaling up effective interventions is the only means to enhance the health of children and adolescents at a population level. Our trial represents one of the largest and most comprehensive type 3 hybrid implementation-effectiveness studies of a whole-of-school physical activity intervention published to date [9, 37, 38]. The iPLAY program reached more than 30,000 students, and implementation fidelity was relatively high and consistent with what was observed in our cluster RCT. These findings hold great promise given the significant barriers to implementing school-based health promoting programs at-scale [39]. The World Health Organization [40] has identified two key reasons why implementation research continues to be neglected: a basic lack of understanding of what implementation research is and why it matters for health, and insufficient research funds to conduct implementation and scale-up studies, as costs often fall outside the capacity of most granting agencies.

The reach of iPLAY across five years was substantial: ~31,000 students from 115 primary schools in NSW, Australia (7% of all government-funded primary schools). By comparison, reach ranged from 210 [41] to 1,000,000 [42] students in a recent systematic review of 14 school-based physical activity dissemination studies [9]. This variability is largely due to the different methods used to calculate reach in dissemination studies [9]. For example, previous studies have used the number of teachers attending professional learning workshops [43, 44] and the ordering of program materials by teachers to estimate reach into the student population [45]. We defined ‘reach’ as the estimated number, proportion and representativeness of students who were potentially exposed to iPLAY, and we estimated it at a single time point using student enrolment data accessed from the publicly available MySchool website. Our approach provides a conservative estimate of iPLAY’s reach because it does not include new students who enrolled in intervention schools in subsequent years and were taught by the now-trained teachers.

Our hybrid type 3 implementation-effectiveness study design allowed us to examine the effectiveness of iPLAY on health outcomes in a sub-sample of students, who were compared with students in the control group from the cluster RCT [18]. Intervention effects for self-reported physical activity, participation in team sports, enjoyment of PE, teacher psychological need satisfaction in PE, and subjective well-being were observed at 24-months. This compares favourably with our cluster RCT [18], where we observed improvements in cardiorespiratory fitness (not measured in our implementation-effectiveness trial) and students’ perceived support from their teachers at 24-months, but no effects for other self-reported outcomes. However, we observed a trend toward improvements in well-being in our cluster RCT, which was confirmed in this larger study. We suggest that the large sample size included in our implementation-effectiveness study provided additional statistical power to detect small, but significant, intervention effects. There is large variability in the way that effectiveness is calculated in dissemination studies, and student-level data are rarely collected [9]. For example, it is not uncommon for studies to refer to effectiveness data from a previous trial [44] and focus on implementation evaluation outcomes. Other studies have examined within-group effects using a sub-study of participants from the larger dissemination study [43].

Of the 115 schools involved in our implementation-effectiveness trial, most were located in major cities, with almost a third from inner regional areas and the remainder from outer regional areas. Our rate of adoption is lower than has been reported in previous school-based physical activity intervention dissemination studies [43, 46]. This may be attributed to a range of factors. First, rates of adoption have been measured in a variety of ways in previous studies. For instance, the Exercise Your Options (EYO) [46] and Resistance Training for Teens (RT for Teens) [43] programs were adopted by 42% and 46% of secondary schools in California and NSW, respectively. Adoption of the Exercise Your Options program was calculated as the proportion of middle school teachers who ordered the program materials. Similarly, adoption of RT for Teens was based on the number and representativeness of schools with one or more teachers trained to deliver the program. Neither of these approaches reflects the level of commitment that was required in our implementation-effectiveness trial. Second, the high rates of adoption reported in the CATCH (since 1997) [45] and SPARK (since 1994) [42] programs are a direct reflection of the long period of time that these programs have been available to schools. Finally, iPLAY is a whole-of-school intervention that requires commitment from school principals and teachers. By comparison, programs such as RT for Teens can be adopted by schools at the discretion of the physical education department.

We also examined the proportion of teachers and leaders who completed the iPLAY training modules as a measure of adoption. Similar to our RCT findings [18], 67% of our teachers completed at least 50% of the iPLAY professional learning modules (70% in the RCT). Typical completion rates for online professional learning are low, often due to low organisational support, insufficient provision of time, low perceived usefulness of content, and poor instructional design [47]. The high levels of adoption observed in this implementation-effectiveness study may be attributable to our learning design choices, which deliberately addressed these barriers (e.g., short online modules that fit within existing structures for professional learning, such as team meetings). The proportion of leaders completing the core learning modules was also very high in our implementation-effectiveness trial (88%) and the RCT (100%). It is not surprising that these individuals were more motivated to complete the allocated training: they were more closely supported by iPLAY mentors and were accountable for supporting other teachers in their schools. Although rates of completion are higher for face-to-face professional learning opportunities, online training is more scalable [18]. Researchers and public health practitioners need to strike a balance between scalability and effectiveness. The implementation science literature is rife with discussion of the ‘adaptation-fidelity dilemma’ [10, 36, 48]. At scale, there must be a balance between the need to adapt an intervention to achieve best fit for a specific setting and delivery partner, and the need to maintain fidelity to the intervention as planned and delivered at smaller scale. Consistent with the iPLAY approach, we believe that online training modules should be supplemented with face-to-face support from external mentors, who provide support and accountability.

Consistent with the RE-AIM framework, we operationalised implementation at the ‘setting level’ and our key focus was teachers’ fidelity in delivering the six intervention components ((i) quality physical education, (ii) classroom energizers, (iii) active homework, (iv) active playgrounds, (v) parental engagement, and (vi) community links) as intended. We demonstrated that iPLAY was delivered as intended in most schools (i.e., delivery of > 50% of curricular and non-curricular strategies). As interventions are delivered at increasing scale, a ‘dynamic tension’ is created between retaining fidelity to the intervention and adapting intervention components and program delivery to meet the needs of different delivery partners and different contexts [49]. At 24-months, 59% of teachers reported delivering at least 150 min of PE each school week. Ninety-two percent of teachers’ PE lessons were rated ≥ 3.0 (out of 5) on the SAAFE evaluation checklist by our mentors. In addition, 48% of teachers reported delivering ≥ 10 classroom energizers per week, and 75% reported setting one active home task per week. Schools’ implementation of the non-curricular intervention components ranged from 96% of schools applying for Sporting Schools funding to 9% of schools having at least one teacher gain a coaching accreditation with a recognised sporting organisation. Rates of curricular and non-curricular implementation fidelity in our implementation-effectiveness trial were almost identical to those observed in our cluster RCT [18]. This is an important finding and provides evidence that our intervention design minimised the ‘program drift’ and subsequent ‘voltage drop’ that typically occur as interventions progress from effectiveness to dissemination [11].

We conducted interviews with teachers, leaders, and principals to determine the extent to which iPLAY was maintained in schools. One of the most consistent points raised by interviewees was that iPLAY changed teacher practices and school culture. This was evidenced through a shift towards a greater emphasis on the programming of quality PE and sport. In most schools, there was clear evidence that iPLAY resources, particularly classroom energizer breaks, were still being used and integrated into school planning after the first year of the intervention. Teachers found that energizer breaks were easy to implement and effective in helping students to focus in the classroom. It is perhaps not surprising that maintenance of iPLAY was greater in schools that had higher rates of adoption (i.e., higher completion rates of professional learning). This may be due to the enhanced knowledge and skills acquired by teachers during professional development, and the associated feelings of confidence and competence to continue program delivery. Finally, principals noted the support (or lack of support) from leaders and mentors as being integral to the long-term maintenance of iPLAY in schools.

Limitations

It is important to note that our study was designed before the StaRI [21] and other guidelines for conducting implementation research were published [50, 51]; more guidance is now available for researchers conducting implementation research. Nevertheless, there are some study limitations that should be noted. First, our implementation-effectiveness trial was retrospectively registered, and we made some changes to our methods as the project progressed. For example, it seemed more appropriate to compare students in the implementation-effectiveness trial with those in the control group from the cluster RCT, as an existing control group provided the opportunity to conduct a more robust assessment of effectiveness; our original approach involved examining within-group changes. Second, most of the data used to evaluate reach, effectiveness, adoption, implementation, and maintenance were self-reported by participants. Nevertheless, these data were complemented by iPLAY mentors’ direct observations of lessons and by objective measures of website usage. Third, our effectiveness data were collected in a sub-sample of schools (~ 10%) and not all students who were assessed at baseline completed the 12- and 24-month assessments. Rates of drop-out are unlikely to have had a substantial effect on our findings, as mixed models are robust to missing data [33]. Finally, we did not specifically measure how teachers adapted the iPLAY intervention. Adaptation is deemed both inevitable and appropriate in dissemination studies [36], and would ideally be monitored and evaluated in future studies.

Conclusions

Ours is one of very few whole-of-school physical activity interventions implemented at broad scale that comprehensively assessed both effectiveness and implementation. We have demonstrated that iPLAY can be successfully scaled up, using face-to-face and online learning and support from external mentors, to reach more than 30,000 students. We also demonstrated that voltage drop is not inevitable when an intervention is implemented at scale, as teachers in more than half of schools were able to retain intervention fidelity. Importantly, positive changes in teacher practices and school culture were maintained over the longer term. We acknowledge that the cost of scaling up school-based intervention studies is prohibitive in most cases and requires external funding support (e.g., from government). Studies that examine implementation strategies that minimise the economic costs of scaling up effective whole-of-school interventions, while retaining student-level benefits, are urgently needed.

Availability of data and materials

Study data and materials are not publicly available; however, they may be available upon request to the lead investigators (DRL and CL). All consenting participants were issued a unique identification number to preserve confidentiality, and all data are stored securely as per ethical requirements.

Abbreviations

CATCH:

Child and Adolescent Trial for Cardiovascular Health

CRF:

Cardiorespiratory fitness

DoE:

Department of Education

iPLAY:

Internet-based Professional Learning to help teachers promote Activity in Youth

NSW:

New South Wales

PE:

Physical education

RCT:

Randomised controlled trial

RE-AIM:

Reach, effectiveness, adoption, implementation, maintenance

RT for Teens:

Resistance Training for Teens

SAAFE:

Supportive, Active, Autonomous, Fair, and Enjoyable

SES:

Socio-economic status

SPARK:

Sports, Play, and Active Recreation for Kids

US:

United States

References

  1. Chaput JP, et al. 2020 WHO guidelines on physical activity and sedentary behaviour for children and adolescents aged 5–17 years: summary of the evidence. Int J Behav Nutr Phys Act. 2020;17(1):1–9.

  2. Kohl HW, et al. The pandemic of physical inactivity: global action for public health. Lancet. 2012;380(9838):294–305.

  3. Aubert S, et al. Global matrix 3.0 physical activity report card grades for children and youth: results and analysis from 49 countries. J Phys Act Health. 2018;15(s2):S251–73.

  4. Hills AP, Dengel DR, Lubans DR. Supporting public health priorities: recommendations for physical education and physical activity promotion in schools. Progress Card Dis. 2015;57(4):368–74.

  5. Centers for Disease Control and Prevention. Comprehensive school physical activity programs: a guide for schools. Atlanta, GA: U.S. Department of Health and Human Services; 2013.

  6. Milton K, et al. Eight investments that work for physical activity. J Phys Act Health. 2021;18(6):625–30.

  7. van Sluijs EMF, et al. Physical activity behaviours in adolescence: current evidence and opportunities for intervention. Lancet. 2021;398(10298):429–42.

  8. Reis RS, et al. Scaling up physical activity interventions worldwide: stepping up to larger and smarter approaches to get people moving. Lancet. 2016;388(10051):1337–48.

  9. Kennedy SG, et al. Implementation at-scale of school-based physical activity interventions: a systematic review utilizing the RE-AIM framework. Obes Rev. 2021;22:e13184.

  10. Milat AJ, et al. The concept of scalability: increasing the scale and potential adoption of health promotion interventions into policy and practice. Health Promot Int. 2013;28(3):285–98.

  11. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Impl Sci. 2013;8(1):117.

  12. Beets M, et al. Identification and evaluation of risk of generalizability biases in pilot versus efficacy/effectiveness trials: a systematic review and meta-analysis. Int J Behav Nutr Phys Act. 2020;17(1):19.

  13. Lane C, et al. How effective are physical activity interventions when they are scaled-up: a systematic review. Int J Behav Nutr Phys Act. 2021;18(1):1–11.

  14. Lubans DR, et al. Rationale and study protocol for the Supporting Children’s Outcomes using Rewards, Exercise and Skills (SCORES) group randomized controlled trial: A physical activity and fundamental movement skills intervention for primary schools in low-income communities. BMC Pub Health. 2012;12:427.

  15. Cohen K, et al. Physical activity and skills intervention: SCORES cluster randomized controlled trial. Med Sci Sports Exerc. 2015;47(4):765–74.

  16. Damschroder LJ, et al. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50.

  17. Lonsdale C, et al. Scaling-up an efficacious school-based physical activity intervention: study protocol for the 'Internet-based Professional Learning to help teachers support Activity in Youth' (iPLAY) cluster randomized controlled trial and scale-up implementation evaluation. BMC Pub Health. 2016;16.

  18. Lonsdale C, et al. Effect of a scalable school-based intervention on cardiorespiratory fitness in children: a cluster randomized clinical trial. JAMA Peds. 2021;175:680–8.

  19. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: The RE-AIM framework. Am J Pub Health. 1999;89(9):1322–7.

  20. Curran GM, et al. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217.

  21. Pinnock H, et al. Standards for reporting implementation studies (StaRI) statement. BMJ. 2017;356:i6795.

  22. Lubans DR, et al. Framework for the design and delivery of organized physical activity sessions for children and adolescents: Rationale and description of the ‘SAAFE’ teaching principles. Int J Behav Nutr Phys Act. 2017;14:24.

  23. Raghuveer G, et al. Cardiorespiratory fitness in youth: an important marker of health. Circulation. 2020;142(7):e101–18.

  24. Glasgow RE, et al. RE-AIM planning and evaluation framework: adapting to new science and practice with a 20-year review. Front Pub Health. 2019;7:64.

  25. Lam S, et al. Understanding and measuring student engagement in school: The results of an international study from 12 countries. School Psych Quart. 2014;29(2):213–32.

  26. McAuley E, Duncan T, Tammen VV. Psychometric properties of the intrinsic motivation inventory in a competitive sport setting: a confirmatory factor analysis. Res Quart Exerc Sport. 1989;60:48–58.

  27. Belmont M, et al. Teacher as social context: a measure of student perceptions of teacher provision of involvement, structure, and autonomy support. Rochester: University of Rochester; 1988.

  28. Vlachopoulos SP, Katartzi ES, Kontou MG. Fitting multidimensional amotivation into the self-determination theory nomological network: application in school physical education. Meas Phys Educ Exerc Sci. 2013;17(1):40–61.

  29. Jang H, et al. Can self-determination theory explain what underlies the productive, satisfying learning experiences of collectivistically oriented Korean students? J Educ Psych. 2009;101(3):644–61.

  30. Ridgers ND, et al. Validity of a brief self-report instrument for assessing compliance with physical activity guidelines amongst adolescents. J Sci Med Sport. 2012;15(2):136–41.

  31. Active Healthy Kids Australia. Is Sport Enough? The 2014 Active Healthy Kids Australia Report Card on Physical Activity for Children and Young People. Adelaide: Active Healthy Kids Australia; 2014.

  32. Roberts C, et al. The Health Behaviour in School-aged Children (HBSC) study: methodological developments and current tensions. Int J Pub Health. 2009;54(2):140–50.

  33. White IR, Carpenter J, Horton NJ. Including all individuals is not enough: lessons for intention-to-treat analysis. Clin Trials. 2012;9(4):396–407.

  34. Elo S, Kyngäs H. The qualitative content analysis process. J Advan Nurs. 2008;62(1):107–15.

  35. Centre for Education Statistics and Evaluation. Schools and students: 2020 Statistical Bulletin. Sydney: NSW Department of Education; 2021.

  36. Kennedy SG, et al. Implementation and scale-up of school-based physical activity interventions, in The Routledge Handbook of Youth Physical Activity, T. Brusseau, S.J. Fairclough, and D.R. Lubans, Editors. New York: Routledge; 2020. p. 438–460.

  37. Sutherland R, et al. Scale-up of the Physical Activity 4 Everyone (PA4E1) intervention in secondary schools: 12-month implementation outcomes from a cluster randomized controlled trial. Int J Behav Nutr Phys Act. 2020;17(1):100.

  38. Lane C, et al. Optimising a multi-strategy implementation intervention to improve the delivery of a school physical activity policy at scale: findings from a randomised noninferiority trial. Int J Behav Nutr Phys Act. 2022;19:106.

  39. Sims-Gould J, et al. Factors that influence implementation at scale of a community-based health promotion intervention for older adults. BMC Pub Health. 2019;19(1):1–12.

  40. Peters DH, Tran NT, Adam T. Implementation research in health: a practical guide. Alliance for Health Policy and Systems Research. World Health Organization; 2013.

  41. Goh TL, et al. Effects of a classroom-based physical activity program on children’s physical activity levels. J Teach Phys Educ. 2014;33(4):558–72.

  42. McKenzie TL, Sallis JF, Rosengard P. Beyond the stucco tower: Design, development and dissemination of the SPARK physical education programs. Quest. 2009;61:1–15.

  43. Kennedy SG, et al. Evaluating the reach, effectiveness, adoption, implementation and maintenance of the Resistance Training for Teens program. Int J Behav Nutr Phys Act. 2021;18:122.

  44. McKay HA, et al. Action Schools! BC implementation: from efficacy to effectiveness to scale-up. Br J Sports Med. 2015;49(4):210–8.

  45. Hoelscher DM, et al. Dissemination and adoption of the Child and Adolescent Trial for Cardiovascular Health (CATCH): a case study in Texas. J Pub Health Manage Prac. 2001;7(2):90–100.

  46. Dunton GF, Lagloire R, Robertson T. Using the RE-AIM framework to evaluate the statewide dissemination of a school-based physical activity and nutrition curriculum: 'Exercise Your Options'. Am J Health Prom. 2009;23(4):229–32.

  47. Lee J, et al. Influences on user engagement in online professional learning: a narrative synthesis and meta-analysis. Rev Educ Res. 2021;91(4):518–76.

  48. Sutherland RL, et al. An RCT to facilitate implementation of school practices known to increase physical activity. Am J Prev Med. 2017;53(6):818–28.

  49. Aarons GA, et al. Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Imple Sci. 2012;7(1):1–9.

  50. Wolfenden L, et al. Designing and undertaking randomised implementation trials: guide for researchers. BMJ. 2021;372: m3721.

  51. McKay H, et al. Implementation and scale-up of physical activity and behavioural nutrition interventions: an evaluation roadmap. Int J Behav Nutr Phys Act. 2019;16(1):102.

Acknowledgements

We would like to thank the participating schools, students and teachers for their support and cooperation throughout the project. We would also like to acknowledge Tara Finn and Kirsty Bergan for their roles in managing the iPLAY study.

Funding

This project was funded by a National Health and Medical Research Council (NHMRC) Partnership Project Grant (APP1114281) and a grant from the NSW Department of Education’s School Sport Unit. The NHMRC had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; or the decision to submit the manuscript for publication. New South Wales Department of Education staff played an important role in the design of the intervention. No staff from the New South Wales Department of Education were involved in the collection, management, analysis, or interpretation of the data. A representative from the NSW Department of Education (JB) reviewed the draft manuscript and is included as a co-author. DRL is supported by an NHMRC Senior Research Fellowship (APP1154507). PP is supported by an Australian Research Council Discovery Early Career Researcher Award (DE140100080). JS is supported by a Leadership Level 2 NHMRC Investigator Grant (APP1176885). MM is supported by an NHMRC Centre for Research Excellence in Obesity Policy and Food Systems (APP1041020).

Author information

Authors and Affiliations

Authors

Contributions

CL and DL conceived the idea for the study and led the design of all aspects. CL, DRL, PP, PM, JS, HM and MM secured funding for the study. DRL and SGK drafted this manuscript. PP and TS conducted the analysis. CL, DRL, TS, MN, PM, JS, JB, AB and RCP contributed to the development of intervention materials. KM, AB, LP, VH, RC, TH, DV, JL, and DA participated in the acquisition of data. All authors contributed to reviewing, editing, and approving the final version of the paper.

Corresponding author

Correspondence to D R Lubans.

Ethics declarations

Ethics approval and consent to participate

Ethical approval for this study was provided by the Australian Catholic University (2014185) and University of Newcastle (H-2016–0135) human research ethics committees and the NSW Department of Education (SERAP2014260). Written informed consent was obtained from all school Principals and teachers. Students and their parents/caregivers were provided with information statements and opt-out consent was applied.

Consent for publication

Not applicable

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1:

Supplementary Table 1. iPLAY Teacher Interview Guide. Supplementary Table 2. Characteristics of students enrolled in implementation-effectiveness schools. Supplementary Table 3. Characteristics and baseline outcomes of students in the control and sub-sample of implementation-effectiveness schools. Supplementary Table 4. Effectiveness analyses for self-reported student outcomes by sex. Supplementary Table 5. Characteristics of schools enrolled in the implementation-effectiveness trial. Supplementary Table 6. Characteristics of teachers in the implementation-effectiveness schools.

Additional file 2:

Supplementary Figure 1. Implementation of curricular and non-curricular intervention components.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Lubans, D.R., Sanders, T., Noetel, M. et al. Scale-up of the Internet-based Professional Learning to help teachers promote Activity in Youth (iPLAY) intervention: a hybrid type 3 implementation-effectiveness trial. Int J Behav Nutr Phys Act 19, 141 (2022). https://doi.org/10.1186/s12966-022-01371-4
