- Open Access
Identifying state-level policy and provision domains for physical education and physical activity in high school
© Hales et al.; licensee BioMed Central Ltd. 2013
Received: 10 December 2012
Accepted: 17 May 2013
Published: 1 July 2013
It is important to quickly and efficiently identify policies that are effective at changing behavior; therefore, we must be able to quantify and evaluate the effect of those policies and of changes to those policies. The purpose of this study was to develop state-level physical education (PE) and physical activity (PA) policy domain scores at the high-school level. Policy domain scores were developed with a focus on measuring policy change.
Exploratory factor analysis was used to group items from the state-level School Health Policies and Programs Study (SHPPS) into policy domains. Items related to PA or PE at the high school level were identified from the seven SHPPS health program surveys. Data from 2000 and 2006 were used in the factor analysis.
From the 98 items identified, 17 policy domains were extracted. Average policy domain change scores were positive for 12 policy domains, with the largest increases for “Discouraging PA as Punishment”, “Collaboration”, and “Staff Development Opportunities”. On average, states increased scores in 4.94 ± 2.76 policy domains, decreased in 3.53 ± 2.03, and had no change in 7.69 ± 2.09 policy domains. Significant correlations were found between several policy domain scores.
Quantifying policy change and its impact is integral to the policy making and revision process. Our results build on previous research offering a way to examine changes in state-level policies related to PE and PA of high-school students and the faculty and staff who serve them. This work provides methods for combining state-level policies relevant to PE or PA in youth for studies of their impact.
In the United States, state and local governments have far-reaching responsibilities for public schools and the youth attending those schools, including their health and welfare. In recent years, growing concerns about the epidemic of childhood obesity and low levels of physical activity (PA) have prompted a large number of legislative and regulatory actions that aim, directly or indirectly, to increase PA in schools. In 2011, 41 states and the District of Columbia (DC) had legislation introduced that was related to PA or physical education (PE) in schools (Database of State Legislative and Regulatory Action to Prevent Obesity and Improve Nutrition and Physical Activity, accessed Jan 2012). While previous research has shown that some state-level legislation and local policies are positively related to PE time and PA levels of students [1–5], there is little empirical support for many of the legislative actions that are pending or have been enacted. This includes support for legislative action directly related to PA (e.g. allowing community access to school playgrounds and fields) and legislation more peripheral to PA levels (e.g. creating a model framework for teacher and principal evaluation instruments or requiring public meetings about education issues). Without evidence for effectiveness it is not known which policy actions are useful and which are ineffectual, placing an undue burden on a system with limited resources.
As state budgets tighten, it becomes increasingly important to quickly and efficiently identify policies that are effective. This requires methods to quantify policies and policy change in a meaningful way that allows careful evaluation of implemented policies. This measurement task is difficult because of the large number and variety of policies, many of which are strongly related to each other in terms of their specific goal, target behavior, and/or agent of change. While policies can be evaluated one by one, related policies are likely to interact with each other in real-life settings, and examining each policy individually could yield misleading results. Indeed, previous research in this area has suggested that, due to the complexity and reach of state-level legislation, it may be more effective to evaluate changes in policy factors or domains, defined as combinations of individual policies that may overlap and tend to change and act together.
We are aware of two systems or policy scoring mechanisms that have been developed to group and quantify school-level PA and/or PE policies [6, 7]. One of these, the Physical Education and Recess State Policy Classification System (PERSPCS), was developed to assess the “nature and extent” of state-level PE statutes and regulations in six areas: PE time, PA time, staffing, curriculum, assessment, and recess. The system uses a rating scale (e.g. 0 to 4) that allows each policy area to be “graded” based on the strength, specificity, and comprehensiveness of the legislation. Summary and area-specific (e.g. PE time, curriculum) scores can be computed for elementary, middle, and high school levels and for all grade levels combined. Currently, state-level ratings are available for 2003 to 2008 and 2010. While the development of this system was an important step forward, it is somewhat limited in scope, covering only a few policy domains, and may require specialized legal training to grade policy areas accurately.
A second policy scoring system was developed as a comprehensive measure of state-level, school-based obesity prevention policies using data collected as part of the 2006 School Health Policies and Programs Study (SHPPS). At the state level, the purpose of SHPPS is to provide data that can be used to describe policies and programs from seven school health program components. Nanney et al. created a PA policy scoring system using 146 items from the PA and PE components of SHPPS. These items were grouped into 10 policy domains using principal components factor analysis, expert opinion, and the relationships among items and policy domains. This approach capitalized on the large number of policy and provision items to construct policy domain scores that combined multiple items to create robust measures of important policy areas. Policy domain-specific scores and an overall summary score were computed using the proportion of policies characterized as “required” (score = 1). Despite several strengths, the system lacks grade-specific policy domain scores, which are useful because PE requirements and implementation differ across grade levels. In addition, the policy domain scores were developed using only items from the 2006 version of the SHPPS survey, making it difficult to use them to evaluate the frequency and impact of policy change if item content and response options change from one administration to the next.
In this paper we build upon this previous research to develop state-level high-school PE/PA policy domain scores specifically designed with a focus on policy change. We use information from both the 2000 and 2006 SHPPS surveys to identify policy domains that can be used to assess change over that period. We describe a set of policy domain scores that can be computed using surveillance data collected as part of the SHPPS survey, and present state-level policy domain scores and changes. Exploratory factor analysis was used to identify groups of items or variables that were statistically related and together represented a concept or domain of interest. Items that group together share variance and can be combined, or modeled, as a single variable. This combination of information from multiple related items generally results in more robust variables and simplified statistical models that are representative of the relationships among the individual items but easier to interpret and apply to processes like policy evaluation.
2000 and 2006 SHPPS data
Data for this study are from the 2000 and 2006 SHPPS [8–11]. This national survey is conducted by the Centers for Disease Control and Prevention every 6 years and is designed to collect information on school health policies (e.g. Has your state adopted a policy…) and practices (e.g. Has your state provided funding or offered…) at the state, district, school, and classroom levels. For this work we use only state-level data for high schools. Although SHPPS provides data for many grade levels, this analysis was limited to high school to allow for future comparisons with physical activity data from the Youth Risk Behavior Survey (YRBS), which are available only for high school students.
In the SHPPS survey, “policy” is defined as:
“any law, rule, regulation, administrative order, or similar kind of mandate issued by the state board of education, state legislature, or other state agency with authority over schools in your state.”
SHPPS data were collected through computer-assisted telephone interviews or self-administered mailed questionnaires from state personnel who are considered most knowledgeable about the relevant policy area. In 87% of states, the PE component of the survey was completed by the self-identified coordinator of PE. All states and the District of Columbia (included in the term “states” from here on) participated in SHPPS in both 2000 and 2006.
State-level policy domains
Policy domains were developed using the results from several exploratory factor analysis models, item grouping from the SHPPS survey, and item/scale psychometrics. Analyses were conducted separately for data from the 2000 and 2006 SHPPS using available information from all 51 states. A summary of item selection, item grouping, and the final policy domains can be found in Figure 1.
All items were scored on a two-level (NO/YES) or three-level (NO, Recommend, Require) scale. Details on the SHPPS scoring system are available in the technical documentation for the survey. For the purposes of this project, items were scored 0 for no policy or 1 for presence of a policy. Several items included a middle category, recommend/encourage; this was scored 0.5 to simplify the creation of factors. The ratio of the sample (51) to items (98) was small, which could reduce the stability of the exploratory factor analysis results. Therefore, we initially grouped items based on the structure of the SHPPS survey and previous research. The grouping resulted in 12 exploratory factor analysis models with sample-to-item ratios ranging from 2.5:1 to 10:1. It was expected that increasing this ratio would make the results of the exploratory factor analysis more stable.
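The item recoding described above (no policy = 0, recommend/encourage = 0.5, policy present or required = 1) can be sketched as follows; the item names and state responses below are hypothetical illustrations, not actual SHPPS variables.

```python
# Recode raw survey responses to the 0 / 0.5 / 1 policy scores described above.
# Item names and responses are hypothetical, not actual SHPPS variables.

SCORE_MAP = {
    "NO": 0.0,
    "NEITHER": 0.0,
    "RECOMMEND": 0.5,   # middle category, scored 0.5
    "ENCOURAGE": 0.5,
    "YES": 1.0,
    "REQUIRE": 1.0,
}

def score_item(response):
    """Map a raw survey response to its 0 / 0.5 / 1 policy score."""
    return SCORE_MAP[response.strip().upper()]

# One hypothetical state's responses to two hypothetical items.
state_responses = {"pe_required": "Require", "fitness_testing": "Recommend"}
scores = {item: score_item(r) for item, r in state_responses.items()}
print(scores)  # {'pe_required': 1.0, 'fitness_testing': 0.5}
```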
One of the goals of this project was to develop policy domains that could be used to examine policy change. To ensure this, decisions about item retention and factor selection were made systematically using both sets of results (2000 and 2006), so that the final factors from SHPPS 2000 contained the same set of items as the final factors from SHPPS 2006. During this process the exploratory factor analysis for a group of items was conducted in both samples and the results were compared. Any item with no factor loading (correlation between the factor and the variable) greater than 0.40 in either sample was removed, and the exploratory factor analysis was repeated. The next steps involved identifying individual items that did not fit well at one of the time points. These items were removed individually, with the rule that final factors had to have the same items in both data sets. Most items were excluded due to low factor loadings (< 0.40) or large cross-loadings (correlation with another factor > 0.40). For several factors, the final models produced estimates with negative error variance for an item. While not ideal, the occurrence of Heywood cases (items with negative variance estimates) is not unexpected given the size of the sample. In each of these cases the final model and items were inspected for over-factoring, and relationships among the items were examined using correlations, Cronbach’s alpha, and item-total correlations. All exploratory factor analyses were conducted using a robust weighted least squares estimator (WLSMV), Geomin rotation, and variables classified as categorical. MPLUS v6 was used for these analyses.
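The two-year item-retention rule described above can be sketched as a simple loop: drop any item whose largest absolute loading is below 0.40 in either year, then refit until the retained set is stable. This is an illustration only; scikit-learn's FactorAnalysis with varimax rotation on simulated continuous data stands in for the WLSMV/Geomin categorical estimation performed in MPLUS, and the data are random placeholders, not SHPPS items.

```python
# Sketch of the item-retention rule: an item is kept only if its maximum
# absolute factor loading is >= 0.40 in BOTH years; refit until stable.
# Illustration only -- sklearn FactorAnalysis + varimax stands in for the
# WLSMV/Geomin estimation done in MPLUS, and the data are simulated.
import numpy as np
from sklearn.decomposition import FactorAnalysis

def retained_items(X_2000, X_2006, n_factors=2, cutoff=0.40):
    keep = np.arange(X_2000.shape[1])
    while True:
        ok = np.ones(len(keep), dtype=bool)
        for X in (X_2000, X_2006):
            fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
            fa.fit(X[:, keep])
            # components_ is (n_factors, n_items); take each item's max |loading|
            ok &= np.abs(fa.components_).max(axis=0) >= cutoff
        if ok.all():
            return keep
        keep = keep[ok]  # drop poorly loading items and refit

def simulate_year(seed, n_states=51):
    rng = np.random.default_rng(seed)
    f1, f2 = rng.normal(size=(2, n_states))
    X = 0.4 * rng.normal(size=(n_states, 8))
    X[:, :3] += 0.9 * f1[:, None]   # items 0-2 load on factor 1
    X[:, 3:6] += 0.9 * f2[:, None]  # items 3-5 load on factor 2
    return X                        # items 6-7 are noise and should be dropped

kept = retained_items(simulate_year(0), simulate_year(1))
print(kept)
```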
State-level policy domain changes
Summaries and comparisons of policy domain and policy domain change scores were estimated using SAS v9.2. Scores were computed for each policy domain using the 2000 and 2006 data, and Cronbach’s alpha was calculated. Policy domain change scores were computed as (score 2006 – score 2000) and were classified as “no change” from 2000 to 2006 if the value changed by less than 20% of the policy domain change score standard deviation. Most states with no change had policy domain change scores of 0.
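The change-score rule above (no change when |score 2006 – score 2000| is less than 20% of the change score's standard deviation across states) can be sketched with hypothetical domain scores:

```python
# Sketch of the policy domain change-score rule: change = score_2006 - score_2000,
# classified as "no change" when |change| < 20% of the change score's SD across
# states. The five states' scores below are hypothetical.
import statistics

score_2000 = {"AL": 0.50, "CA": 0.75, "NY": 0.25, "TX": 0.50, "UT": 0.00}
score_2006 = {"AL": 0.50, "CA": 1.00, "NY": 0.00, "TX": 0.55, "UT": 0.75}

changes = {st: score_2006[st] - score_2000[st] for st in score_2000}
threshold = 0.20 * statistics.stdev(changes.values())  # 20% of the sample SD

def classify(change, threshold):
    if abs(change) < threshold:
        return "no change"
    return "increase" if change > 0 else "decrease"

for st, ch in changes.items():
    print(st, round(ch, 2), classify(ch, threshold))
```

Note that a small nonzero change (TX, +0.05) can still fall below the threshold and be classified as "no change", consistent with the rule as stated.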
State-level policy domains
Summary of policy domains, factor loadings, internal consistency (alpha), policy domain scores (mean, SD), and policy domain change score (mean, SD)
Policy domain (# items)
Factor loadings (min-max)
Alpha (2000/ 2006)
Policy domain scores
Has your state adopted a policy stating that head coaches of interscholastic sports will have a teaching certificate? (YES/NO)
During the past 12 months, have state physical education staff worked on physical education activities with staff or members from state-level health organizations (e.g. AHA, ACS)? (YES/NO)
0.464 – 0.995
Exemptions from PE: religious or disability (3)
Based on policies adopted by your state, can senior high school students be exempt from physical education requirements for one grading period or longer for religious reasons? (YES/NO)
0.603 – 1.05
Exemptions from PE for school or sport participation (4)
Based on policies adopted by your state, can senior high school students be exempt from physical education requirements for one grading period or longer for participation in other school related activities? (YES/NO)
0.613 – 1.08
Require protective gear (3)
Has your state adopted a policy requiring that students wear appropriate protective gear when engaged in physical activities during physical education? (YES/NO)
0.68 – 1.06
Maintain or inspect PA facilities (3)
Has your state adopted a policy on the inspection or maintenance of playground facilities and equipment, such as playing surfaces, benches, monkey bars, and swings? (YES/NO)
0.844 – 1.03
Provide PE information or materials (5)
During the past 2 years, has your state education agency provided the following information or materials for senior high school physical education: Lesson plans or learning activities for physical education? (YES/NO)
Discourage physical activity as punishment (4)
Has your state adopted a policy that prohibits schools from using physical activity (e.g. laps or push-ups) to punish students for bad behavior in physical education? (YES/NO)
0.740 – 1.0
Implementation of adaptive PE (5)
Has your state adopted a policy stating that schools will implement the following measures to meet the physical education needs of students with permanent physical or cognitive disabilities: Providing adaptive physical education as appropriate? (YES/NO)
0.863 – 0.991
Staff development opportunities (14)
During the past 2 years, has your state education agency provided any funding for or offered staff development on each of the following topics to those who teach physical education (including workshops, conferences, continuing education, graduate courses, or other in-kind services): Encouraging family involvement in physical activity? (YES/NO)
Standards and compliance for PE (6)
Which of the following methods does your state education agency use to improve district or school compliance with physical education standards or guidelines: Submission of written reports by districts or schools? (YES/NO)
Testing requirements for PE (6)
Does your state education agency require or recommend that senior high schools test students’ fitness levels? (Require, Recommend, Neither)
0.637 – 0.997
Goals and objectives for PE (5)
Do the goals or objectives for senior high school physical education specifically address each of the following student outcomes: Regular participation in physical activity? (YES/NO)
0.789 – 1.16
Physical activity promotion: faculty and staff (3)
During the past 12 months, has your district provided any funding for or sponsored each of the following services or programs for school faculty and staff: Physical activity and fitness counseling? (YES/NO)
State certification for PE teachers (2)
Which of the following types of certification, licensure, or endorsement does your state offer for physical education teachers: physical education for senior high school? (YES/NO)
Requirement when hiring new PE teachers (2)
Has your state adopted a policy stating that newly hired staff who teach physical education at each of the following levels will have undergraduate or graduate training in physical education or a related field: Senior high school? (YES/NO)
0.75 – 1.0
Teaching and time requirement for PE (2)
Has your state adopted a policy that senior high schools will teach physical education? (YES/NO)
Cronbach’s alpha ranged from a low of 0.54 for “Exemptions from PE: religious or disability” (PD3) in 2000 to a high of 0.99 for “Goals and Objectives for PE” in 2000. About 67% of the policy domains had alpha values greater than 0.75, and all but one alpha was greater than 0.60. On average, the alphas differed only slightly between years (0.07 units), with 10 higher in 2000 and 7 higher in 2006. The largest difference between alphas at the two time points was about 0.2 units, for “Exemptions from PE: religious” and “Provide PE information”. The alpha for “Physical Activity Promotion for Staff” could not be computed in 2000 because two of the three items had zero variance.
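Cronbach's alpha, the internal-consistency measure reported above, can be computed directly from the item scores; the 5-state × 3-item score matrix below is hypothetical.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(total score)).
# The 5-state x 3-item matrix of 0/0.5/1 policy scores below is hypothetical.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()      # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total score
    # If items have zero variance (as for two "PA Promotion" items in 2000),
    # alpha is not meaningful and this ratio degenerates.
    return k / (k - 1) * (1 - item_var / total_var)

X = np.array([
    [1.0, 1.0, 0.5],
    [0.5, 0.5, 0.5],
    [0.0, 0.0, 0.0],
    [1.0, 0.5, 1.0],
    [0.0, 0.5, 0.0],
])
print(round(cronbach_alpha(X), 2))  # 0.87
```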
State-level policy domain changes
A summary of policy domain changes in each state is available in Additional file 2. Most states were missing very few policy domain change scores: twenty-five states were missing none, seventeen were missing 1, six were missing 2, and two were missing 3. Mississippi, however, was missing 8 of 17 policy domain change scores due to incomplete data from the SHPPS in 2000. On average, states increased scores in 4.94 ± 2.76 policy domains, decreased in 3.53 ± 2.03, and had no change in 7.69 ± 2.09 policy domains. In Utah, 13 of the 17 policy domain scores increased from 2000 to 2006, while Oregon, Pennsylvania, Delaware, and Nevada all saw increases in 9 policy domains. The fewest positive policy domain changes were seen in Montana (0), Ohio (0), Missouri (1), South Carolina (1), and Alabama (1).
Correlations among policy domains
Correlations between policy domain change scores (change from 2000 to 2006) [below diagonal] and cross-sectional correlations between policy domain scores in 2000 [above diagonal] and in 2006 [above diagonal, in parentheses] (n = 45–51)
3. Exemptions from PE: religious or disability
4. Exemptions from PE for sport participation
5. Require protective gear
6. Maintain or inspect PA facilities
7. Provide PE information/materials
8. Discourage physical activity as punishment
9. Implementation of adaptive PE
10. Staff development opportunities
11. Standards and compliance for PE
12. Testing requirements for PE
13. Goals and objectives for PE
14. PA promotion: faculty and staff
15. State certification for PE teachers
16. Requirement when hiring new PE teachers
17. Teaching and time requirement for PE
Quantifying policy change and its impact is integral to the policy making and revision process. Building on previous work in this area, the results of this study were used to identify a set of 17 policy domains. They were developed to be specific to high schools and to contain the same information over time, enhancing our ability to examine change in policy. Data from two administrations of the SHPPS survey (2000 and 2006), a national policy surveillance instrument, were used. The resulting policy domain scores can be applied during the evaluation process to summarize policy change related to student behavior and will be useful in gaining a better understanding of the similarities and differences among specific policies and provisions for PA and PE. In addition, it will be interesting to see how policy change progresses in each policy domain by applying these results to data from the 2012 administration of the SHPPS survey.
State-level policy domains
Previous work in this area provided guidance in developing state-level PE and PA policy domains. In their work, Nanney and colleagues identified 10 policy domains using state-level policy and practice data for elementary, middle, junior, and senior high schools from SHPPS 2006. Nine could be applied to senior high schools (walking to school was not applicable for high schools). Of these, five are similar to those identified in the current study. Three are nearly identical (Physical Activity as Punishment (PD8), Protective Gear (PD5), and Adaptive PE (PD9)), while Testing (PD12) and Collaboration (PD2) are similar to the Assessment and Collaboration policy domains identified by Nanney et al. (2010), but contain fewer items. The difference in items is primarily due to the fact that in the previous study, items that applied to elementary, middle, and junior high school were included in the policy domain development. While some items and policy domains will be similar across grade levels, we feel that grade-specific policy domain scores are useful for several reasons. First, PE requirements and implementation are quite different across grade levels. This means that while PE policies may be related for middle and high schools, they are likely not the same. Therefore, a state-level policy domain score for “standards” that includes all grades may not truly reflect the strength or weakness in policy at a given grade level, making it more difficult to assess policy impact. Second, the available data on PE and PA participation for different-aged students are often collected in different ways (e.g. high schools collect self-report like the YRBS; elementary schools rely on observation or proxy report). This makes it difficult to compute the state-level behavioral outcomes needed for comparison to a general (all grade levels) policy domain score. Finally, differentiation of policy effects may be particularly important during different developmental periods.
For example, requiring more PE or PA in school may be most beneficial during early to middle adolescence, when overall activity levels decline more rapidly, especially in girls. Having only general policy domain scores would make it hard, if not impossible, to identify potentially important effects of policy change during these influential periods.
The final two policy domains identified by Nanney et al., Standards and Training, included a large number of items. In our work, several smaller, more specific policy domains were identified within these larger groups of items. For example, the previous study created one training policy domain with 38 items, including 27 related to high school. Our analysis suggested that they should be separated into policy domains related to “PE certification” (PD15, PD16), “Coaches training” (PD2), and “Staff development” (PD10). Judging from our correlational and state-level change results, these policy domains appear distinct. For the Standards policy domain, Nanney and colleagues identified 35 items, 10 of which apply to high school. Our results suggest that these items may not represent a single policy domain, but rather “General PE standards” (PD11), “PE goals” (PD13), and “PE teaching/time requirements” (PD17). In our correlational results, “General PE standards” and “PE goals” had the strongest relationship (r ~ 0.75), which suggests that these policy domains might be combined. Given the other data available, like item content, scatter plots, and policy domain change scores, it is difficult to tell whether these factors should be merged or whether they represent separate ideas and actions that are related but need to be differentiated. At this time we suggest that these policy domains be studied separately. Future research may show that they relate to behavioral outcomes or legislative change in similar ways, but for now they should be treated as distinct.
State-level policy domain changes
Averaged over all states, 11 of the 17 policy domain scores did not change meaningfully from 2000 to 2006. Similar information can be found in the PERSPCS data (http://class.cancer.gov/index.aspx), which showed that while average PE policy domain scores increased about 8%, most states (34 of 51) showed no change from 2003 to 2008. Looking at data from 2003 to 2006, dates that more closely match the SHPPS data used in this study, 43 states had zero change in PE policy domain scores (CLASS.cancer.gov data accessed Jan 2012). While the average policy domain score results are similar, our data showed more variation between states. In our sample, every state changed on at least 4 policy domains, with most having substantial change on at least 8 policy domain scores. The difference between the PERSPCS data and our results is likely related to differences in data collection and content coverage.
The PERSPCS data and scoring focus on laws and regulations in six key areas, which were systematically scored by trained researchers. In contrast, SHPPS data were self-reported, covered a greater number of policy domains, and included more policy and provision items. Important changes in policies and provisions for PA in high schools may often be implemented without specific changes to state laws and regulations; when this occurs, the PERSPCS system is unlikely to detect change. It should also be noted that while one study has concluded that reliability and validity evidence for the SHPPS data is acceptable, measurement error could be inflating the amount of change estimated in the new policy domains. At this point it is safe to say that both scoring systems are important to understanding the relationship between policy and PA. Future research should help to pinpoint where each is most useful and how policy domain scores from each relate to behavioral outcomes.
This research study benefited from the comprehensiveness of the data collected in the SHPPS survey, but the number of items compared to the number of respondents was less than ideal for factor development. This is the primary reason we conducted several smaller exploratory factor analysis models and used expert judgment and inter-item relationships when making final decisions about a specific policy domain or a questionable item. With only 51 possible respondents the robustness and usefulness of some domains could be questioned. We also recognize that the correlations between combinations of policies can be influenced by unmeasured policies or other unmeasured attributes. This type of problem is not unique to this analysis, but analyses of numerous combined policies in this area of study are relatively new, and important sources of bias and confounding may not yet be fully understood. We suggest that researchers continue to search for variables that influence associations between policies and their targets and that the policy domains proposed here be reevaluated after the SHPPS survey is re-administered in 2012.
Examining the effects of policy change on their intended targets is a major part of the policy evaluation-revision cycle. This research supports this type of future work by providing a means of examining changes in state-level policy domains related to PE and PA of high-school students and the faculty and staff who serve them. The results build on previous research to offer a new way to examine the effects of policy change on behaviors. Future research should connect policy change not only to PE but also to overall PA, and provide guidance to policy makers who seek ways to promote PA and health in children.
We thank Dr. Chris Baggett for his critical review of this manuscript.
- Barroso CS, Kelder SH, Springer AE, Smith CL, Ranjit N, Ledingham C, Hoelscher DM: Senate Bill 42: implementation and impact on physical activity in middle schools. J Adolesc Health. 2009, 45: S82-S90. 10.1016/j.jadohealth.2009.06.017.
- Cawley J, Meyerhoefer C, Newhouse D: The impact of state physical education requirements on youth physical activity and overweight. Health Econ. 2007, 16: 1287-1301. 10.1002/hec.1218.
- Durant N, Harris SK, Doyle S, Person S, Saelens BE, Kerr J, Norman GJ, Sallis JF: Relation of school environment and policy to adolescent physical activity. J Sch Health. 2009, 79: 153-159. 10.1111/j.1746-1561.2008.00384.x.
- Haug E, Torsheim T, Samdal O: Local school policies increase physical activity in Norwegian secondary schools. Health Promot Int. 2010, 25: 63-72.
- Slater SJ, Nicholson L, Chriqui J, Turner L, Chaloupka F: The impact of state laws and district policies on physical education and recess practices in a nationally representative sample of US public elementary schools. Arch Pediatr Adolesc Med. 2012, 166: 311-316. 10.1001/archpediatrics.2011.1133.
- Nanney MS, Nelson T, Wall M, Haddad T, Kubik M, Laska MN, Story M: State school nutrition and physical activity policy environments and youth obesity. Am J Prev Med. 2010, 38: 9-16. 10.1016/j.amepre.2009.08.031.
- Masse LC, Chriqui JF, Igoe JF, Atienza AA, Kruger J, Kohl HW, Frosh MM, Yaroch AL: Development of a Physical Education-Related State Policy Classification System (PERSPCS). Am J Prev Med. 2007, 33: S264-S276. 10.1016/j.amepre.2007.07.019.
- Burgeson CR, Wechsler H, Brener ND, Young JC, Spain CG: Physical education and activity: results from the School Health Policies and Programs Study 2000. J Sch Health. 2001, 71: 279-293. 10.1111/j.1746-1561.2001.tb03505.x.
- Lee SM, Burgeson CR, Fulton JE, Spain CG: Physical education and physical activity: results from the School Health Policies and Programs Study 2006. J Sch Health. 2007, 77: 435-463. 10.1111/j.1746-1561.2007.00229.x.
- Kann L, Brener ND, Wechsler H: Overview and summary: School Health Policies and Programs Study 2006. J Sch Health. 2007, 77: 385-397. 10.1111/j.1746-1561.2007.00226.x.
- Kolbe LJ, Kann L, Brener ND: Overview and summary of findings: School Health Policies and Programs Study 2000. J Sch Health. 2001, 71: 253-259. 10.1111/j.1746-1561.2001.tb03502.x.
- School Health Policies and Programs Study (SHPPS) 2000: a summary report. J Sch Health. 2001, 71: 251-350. 10.1111/j.1746-1561.2001.tb03500.x.
- Costello AB, Osborne JW: Best practices in exploratory factor analysis: four recommendations for getting the most from your analysis. Pract Assess Res Eval. 2005, 10 (7): 1-9.
- Cronbach LJ: Coefficient alpha and the internal structure of tests. Psychometrika. 1951, 16: 297-334. 10.1007/BF02310555.
- Kimm SY, Glynn NW, Kriska AM, Barton BA, Kronsberg SS, Daniels SR, Crawford PB, Sabry ZI, Liu K: Decline in physical activity in black girls and white girls during adolescence. N Engl J Med. 2002, 347: 709-715. 10.1056/NEJMoa003277.
- Brener ND, Kann L, Smith TK: Reliability and validity of the School Health Policies and Programs Study 2000 questionnaires. J Sch Health. 2003, 73: 29-37. 10.1111/j.1746-1561.2003.tb06556.x.
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.