Using automated active infrared counters to estimate footfall on urban park footpaths: behavioural stability and validity testing

Abstract

Background

Using infrared counters is a promising unobtrusive method of assessing footfall in urban parks. However, infrared counters are susceptible to reliability and validity issues, and there is limited guidance for their use. The aims of this study were to (1) determine how many weeks of automated active infrared count data would provide behaviourally stable estimates of urban park footfall for each meteorological season, and (2) determine the validity of automated active infrared count estimates of footfall in comparison to direct manual observation counts.

Methods

Three automated active infrared counters collected daily footfall counts for 365 days on three footpaths in an urban park within Northampton, England, between May 2021 – May 2022. Intraclass correlation coefficients were used to compare the behavioural stability of abbreviated data collection schedules with total median footfall within each meteorological season (Spring, Summer, Autumn, Winter). Public holidays, events, and extreme outliers were removed. Ten one-hour manual observations were conducted at the site of an infrared counter to determine the validity of the infrared counter.

Results

At least four weeks (28 days) of infrared counts are required to provide ‘good’ to ‘excellent’ (intraclass correlation > 0.75 and > 0.9, respectively) estimates of median daily footfall per meteorological season in an urban park. Infrared counters recorded, on average, 4.65 fewer counts per hour than manual observation (mean difference -4.65 counts; 95% LoA -12.4, 3.14; mean absolute percentage error 13.7%) during one-hour observation periods (23.2 ± 15.6 vs 27.9 ± 18.9 counts per hour, respectively). Infrared counts explained 98% of the variance in manual observation counts. The number of groups during an observation period explained 78% of the variance in the difference between infrared and manual counts.

Conclusions

Abbreviated data collection schedules of at least four weeks per season can still provide behaviourally stable estimates of urban park footfall. Automated active infrared counts are strongly associated with manual counts; however, they tend to underestimate footfall, largely because of people passing the counter in groups. Methodological and practical recommendations are provided.

Background

Restructuring physical environments (such as parks, woodlands and squares) is a promising intervention to increase population-level physical activity. Despite an abundance of cross-sectional evidence linking features of the built environment to physical activity levels, there is a dearth of robust intervention-based evaluations [2]. The expectation to conduct robust evaluations of how environmental restructuring increases physical activity participation [20] has grown among public health and government bodies in recent years. This heightened expectation has been particularly evident in England, with the release of policies such as ‘Gear Change: a bold vision for cycling and walking’ [8], ‘Active Travel Fund Monitoring Guidance 2020’ [9], and ‘Improving access to greenspace: a new review for 2020’ [28]. There has also been the establishment of Active Travel England, the executive agency that will act as the inspectorate and funding body for active travel schemes in England [34], as well as cross-government investment in green social prescribing [35], and the launch of the Green Infrastructure Framework [24].

Due to researchers’ lack of control over environmental changes, the optimal study design to evaluate environmental restructuring is to make use of natural experiments. Natural experiments are real-world interventions that are not under the control of researchers; therefore, exposure to the event or intervention of interest has not been manipulated by the researchers [7]. Researchers can design studies around a natural experiment to assess intervention effectiveness (i.e. natural experimental studies).

Within natural experimental studies that have examined changes in physical activity, footfall monitoring is frequently utilised [12]. Footfall monitoring can be conducted using manual counts. However, manual count methods can be resource intensive (i.e. the cost of researcher time) and are at risk of sampling error because of the often short observation windows (e.g. four days, four hours per day) used to make causal inferences about the effectiveness of an intervention.

An increasingly popular alternative is to use automated counts from an electronic device. Automated counter systems tend to offer a cheaper and less labour-intensive monitoring solution than manual counts, facilitating longer-term monitoring of interventions. Automated counts are a particularly useful option for local government agencies, who often have limited human and financial capacity to conduct evaluations. However, as the market for these automated tools grows, so does the need to assess their reliability and validity for use by researchers and local government agencies. There are several types of automated counter that can be deployed depending on the research question and the environment being studied. Pneumatic tubes have been widely used as a temporary traffic monitoring system that allows motor vehicles and bicycles to be distinguished. Pneumatic tube systems have demonstrated strong explained variance in comparison to manual observations (r2 = 0.88 – 0.92) but tend to underestimate cycling counts by 6 – 57%, depending on location [16]. Machine-learning video camera systems that use publicly accessible traffic cameras have the potential to monitor pedestrians and people cycling in urban environments at scale, but are normally limited to dense urban environments, such as town centres, rather than greenspaces [5]. Alternatively, Strava Inc. (San Francisco, USA) released Strava Metro access to local authorities for free during the coronavirus-19 pandemic to facilitate active travel planning. Strava Metro can provide counts along any walked or cycled pathway that is logged by Strava app subscribers, which provides greater flexibility to monitor footfall in any geographical location (greenspaces, urban areas, remote locations), and it has shown moderate to strong correlations with manual observations of cycling in urban environments [15]. In greenspaces, preliminary data have suggested strong correlations between Strava Metro and automated active infrared counters (r = 0.75), but Strava Metro underestimated path use by 6,639 counts per month [30].

Both automated passive and active infrared counters could be suitable devices for footfall monitoring in greenspaces and rural environments, as they have a long battery life, can easily be attached to existing furniture (e.g. gates and fenceposts), and are affordable, although they are unable to distinguish between behaviours (walking and cycling) [18]. Passive infrared counters use a single sensor to detect changes in infrared radiation within their field of view, allowing them to detect humans and animals, whereas active infrared counters use a gate system with a transmitter and receiver to create an infrared beam across a path, registering the presence of a human or animal when the beam between the transmitter and receiver is broken. Thirty-three passive infrared sensors were deployed across Ireland to determine changes in trail use during the coronavirus-19 pandemic as part of the TrailGazers EU project [27], which aimed to create a framework of technologies to monitor footfall to assist with future planning and tourism management of rural environments and greenspaces. Despite the growing use of infrared counters, they can be susceptible to reliability and validity issues, such as an inability to count individuals within a group, miscounting due to wildlife or foliage interference, and vandalism [10]. Furthermore, there is limited guidance for using automated active infrared counters [10, 21].
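
As a conceptual illustration only (a toy sketch, not the firmware of any particular counter), an active infrared gate of this kind effectively counts falling edges of the beam signal, which is also why several people crossing side by side can register as a single count:

```python
def count_beam_breaks(beam_connected: list[bool]) -> int:
    """Count transitions from 'beam connected' to 'beam broken' in a sampled signal."""
    breaks = 0
    for previous, current in zip(beam_connected, beam_connected[1:]):
        if previous and not current:  # the beam has just been interrupted
            breaks += 1
    return breaks

# Two people crossing one after another register twice; side by side, only once.
assert count_beam_breaks([True, False, True, False, True]) == 2
assert count_beam_breaks([True, False, False, True]) == 1
```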

It is currently unclear how much count data should be collected to produce behaviourally stable estimates of urban greenspace footfall, especially across different meteorological seasons, given variations in footfall related to weather conditions. Behavioural stability, a domain of reliability, represents the consistency of a behavioural outcome’s variability over time [13, 31]. Studies have already determined the minimum number of observation days and durations needed to provide a behaviourally stable estimate of footfall with manual observation count tools, such as SOPARC [6] and MOHAWk [3], but to the authors’ knowledge no such studies have been conducted using infrared counters.

To address these gaps, the aims of this study were to: (1) determine how many weeks of automated active infrared count data would provide behaviourally stable estimates of urban park footfall for each meteorological season, as physical activity levels are known to be highest in spring and summer [33], and (2) determine the validity of automated active infrared count estimates of footfall in comparison to direct manual observation counts. Each aim within the current research was considered under the condition that the automated active infrared counters were fully operational (equipment efficacy).

Methods

Research setting

The data collection for the current research took place within Delapré Park, Northampton, England. Northampton is ranked the 125th most income-deprived of the 216 local authorities in England [25]. Twenty-two of the 133 neighbourhoods within Northampton were categorised as being within the top 20% most income-deprived in England, while 26 neighbourhoods were within the top 20% least income-deprived [25]. Findings from the 2019 Monitoring Engagement in the Natural Environment Survey suggested that people from Northamptonshire visit greenspaces 90 times per year, similar to the rest of England. They also spend on average 111.8 min per visit within greenspace, which is 25.7 min less than the rest of England [23].

Delapré Park is located within the urban centre, south of Northampton town centre and the River Nene (Fig. 1). The park contains a mixture of land uses, including a lake, woods, a Heritage Building, and a historic battlefield. Previous MOHAWk observations within Delapré Park estimated that men (57%) and women (43%) use this park primarily for walking and running (43%) or dog walking (22%) [30]. The majority of park users were observed to be adults (77%) and of white ethnicity (82%) [30]. The park is used by local residents for leisure and active commuting, as well as for a range of events.

Fig. 1 Index of Multiple Deprivation 2019 of Lower Layer Super Output Areas within Northampton, England. Black circle indicates the location of Delapré Park. Map provided freely, without permission being required, by mySociety [22]

Automated active infrared counters

Six automated active infrared counters (DE outdoor bi-directional counter, SensMax Ltd, Riga, Latvia) were installed within Delapré Park as part of a wider project [29]. The counters were placed throughout the park on a 3 km circular walking route (Figs. 2 and 3). For the current study, data from three counters were used because they had no missing data across the 365 days of monitoring caused by a known counter fault, such as misalignment of the transmitter and receiver units, a dead battery, or vandalism. Only the three counters with a complete year-long dataset were used in order to increase the generalisability of the findings for researchers or practitioners who may want to use these counters in their own projects. The use of a complete dataset allows the current research to investigate the efficacy of the counters, i.e. when the counters are functioning, can they actually estimate seasonal footfall? Within the current study, a day of missing data was usually due to an external issue, such as counter vandalism, a dead battery, or misalignment of the transmitter and receiver (the latter occurred later in the project as the wooden posts to which the counters were attached began to warp). If these missing-data days had been included, the findings would instead reflect counter effectiveness, which is less generalisable for researchers and practitioners because effectiveness is more susceptible to the differing contexts of study locations.

Fig. 2 Automated active infrared counter locations within Delapré Park, Northampton, England. Yellow numbered circles indicate counter location and ID number for the counters used within the current study. White circles indicate counter locations as part of the wider project

Fig. 3 Images of the three counter locations used for the current research. A is counter position 1. B is counter position 2. C is counter position 3

Counters were housed within an IP68-rated ABS plastic outdoor housing case (SensMax Ltd, Riga, Latvia) and mounted to wooden posts using four wood screws at a height of 1.14 m. At this height the counters could detect running, walking, cycling, and wheeling by people taller than 1.14 m, but could not distinguish between these behaviours. The wooden posts for each transmitter and receiver were installed between 2.0 and 2.45 m apart at ‘bottle-neck’ points on footpaths to increase the likelihood of footpath users passing through the counter system (Fig. 3). According to the manufacturer, the counters have a 95% counting accuracy when the transmitter and receiver are placed up to 2.0 m apart, and 1% of accuracy is lost for every additional metre. Data from each counter were downloaded every one to two weeks, to minimise data collection disruption due to faults or vandalism, using the SensMax DE Collector remote (SensMax Ltd, Riga, Latvia) and EasyReport 14.1 Pro software (SensMax Ltd, Riga, Latvia).
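
Read linearly (an assumption on our part; the manufacturer states only the 95% baseline and the 1% loss per additional metre), this claim corresponds to an expected counting accuracy at a transmitter–receiver spacing of d metres of

$$\text{accuracy}(d) \approx 95\% - 1\% \times \max(0,\ d - 2.0),$$

so the 2.0 – 2.45 m spacings used here imply a claimed accuracy of roughly 94.6 – 95%.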

Counter 1 was located on a paved footpath adjacent to the driveway access for the Heritage Building. Counter 2 was located at the bottom of a downward-sloping grass ramp and steps on the perimeter of the ‘South Lawn’. Counter 3 was located in the ‘Woods’ on a trodden mud path (Figs. 2 and 3). The counters provided counts for each direction of travel on the path from 4th May 2021 to 3rd May 2022. Directional counts were summed to create a total daily footfall for each counter. Total daily footfall was used as the behavioural stability outcome measure for the current research, while directional counts were used for validity testing. Ethical approval was granted by the University of Northampton Faculty Ethics Committee (approval code: 202102).

Manual observations for counter validation

Counter 1 was chosen to validate the active infrared counters because there was a physical bottleneck at this location, meaning footpath users had to pass through the counter. Ten one-hour manual observations were conducted by the lead author on Thursday 21st and Friday 22nd July 2022 (five observations per day), starting at 08:00 and finishing at 17:00, with a one-hour break between each observation.

The observer recorded the number of people who passed through counter 1, in each direction, during each observation period in order to validate the counts from the automated active infrared counter. The size of groups passing through the counter was noted, as groups are known to cause underestimation of footfall when people break the infrared beam simultaneously [10]. The sensor cannot capture movement below the height of the infrared beam, so any individuals observed to be shorter than the counter height (1.14 m) were noted but not included in the statistical analysis; the focus of the validation was to determine the ‘true counts’ of the sensor, rather than the total footfall of the path. Likewise, children in pushchairs or being carried by adults were not counted because they would also not be recorded by the sensor. The observation period on 21st July 2022 at 16:00 was abandoned because a large group of 50+ people passed through the automated active infrared counter, leaving the observer uncertain about the total number of people who had passed through. Consequently, the 16:00 observation period was repeated the following week (Thursday 28th July 2022).

Weather trends

To outline trends in weather conditions across meteorological seasons, daily mean temperature (degrees centigrade), daily mean wind speed (miles per hour), and daily total rainfall (inches) were monitored from a local weather station [37]. These real-time measures of weather were recorded as they have previously been associated with various measurements of physical activity [4].

Statistical analysis

Behavioural stability of abbreviated observational periods

For each counter, days determined to have extreme count outliers (i.e. a value more than three times the inter-quartile range (Q3–Q1) above the upper (Q3) or below the lower (Q1) quartile) were omitted from the analysis. To determine an explanation for each outlier, these outlier days were cross-referenced with dates of public holidays in England and site-specific events hosted at the park (e.g. a one-off running event), because the Heritage Building located within the park had historically experienced increases in footfall on these days. Any counts that occurred on public holidays were omitted from analysis for all three counters, whereas counts on event days were only omitted for counters that reported an extreme count outlier on that day. These decisions were made to minimise the influence of inflated counts and to improve the generalisability of the findings, as other parks may not host events and parks in other countries may have different public holidays. All remaining extreme outliers were greater than the mean counts per day and were therefore thought to be due to either foliage obscuring the counter or purposeful tampering (e.g. repeatedly waving a hand to break the beam). This resulted in a total of 328 days (Counter 1), 339 days (Counter 2) and 341 days (Counter 3) of monitoring being retained for analyses. These outlier-handling decisions were made because the authors expected that these counters may be used in natural experimental studies or longitudinal monitoring of park footfall, within which counts from public holidays or events could overinflate estimates of park footfall and may therefore be omitted or adjusted for. Furthermore, the decision to remove extreme outliers that were not due to public holidays or events was informed by the authors’ knowledge of the area, accrued through hours of observation while visiting the park throughout the study. This development of researcher knowledge of the study area aligns with recommended planning processes for natural experimental studies and the use of manual observation tools [2, 3].
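
As an illustration only, the extreme-outlier rule above can be expressed as a short pandas sketch (this is not the authors’ SPSS procedure, and `daily_counts` is a hypothetical series of daily footfall for one counter):

```python
import pandas as pd

def flag_extreme_outliers(daily_counts: pd.Series) -> pd.Series:
    """Flag days whose count lies more than 3 x IQR beyond Q1 or Q3."""
    q1, q3 = daily_counts.quantile([0.25, 0.75])
    iqr = q3 - q1
    return (daily_counts < q1 - 3 * iqr) | (daily_counts > q3 + 3 * iqr)

# Hypothetical usage: drop flagged days (after cross-referencing them against
# public holiday and event dates) before the behavioural stability analyses.
# clean_counts = daily_counts[~flag_extreme_outliers(daily_counts)]
```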

Two-way mixed, single-measure, consistency intraclass correlation coefficients (ICCs) were used to calculate the mean behavioural stability of total median counts for each counter under different abbreviated data collection schedules within each meteorological season (Spring, Summer, Autumn, Winter). Specifically, mean ICCs were calculated for all possible combinations of 1, 2, 3 and 4 weeks per season for each counter: there were 13 unique combinations for 1 week (as each season had a total of 13 weeks), 78 unique combinations for 2 weeks, 286 unique combinations for 3 weeks, and 715 unique combinations for 4 weeks. Weeks that included days of missing data were omitted from analysis, as their inclusion would have meant incomplete data being used to calculate mean ICCs (e.g. only 3 weeks of data in a 4-week combination). The mean ICCs were then compared with the entire season. ICCs can be interpreted as < 0.5 = poor, 0.5 – 0.75 = moderate, 0.75 – 0.9 = good, and > 0.9 = excellent [14].
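
The sketch below (an assumed Python equivalent of the SPSS workflow; the data structures and variable names are hypothetical) illustrates the two computational steps involved: enumerating the week combinations for one counter in a 13-week season, and computing a two-way mixed, single-measure, consistency ICC, i.e. ICC(3,1), from an n targets × k raters ratings matrix. Mean ICCs per schedule length would then be the average across all combinations of that length.

```python
from itertools import combinations
import numpy as np
import pandas as pd

def combination_medians(daily_counts: pd.Series, n_weeks: int) -> list[float]:
    """Median daily count for every unique combination of n_weeks complete weeks.

    daily_counts: daily footfall for one counter in one season (DatetimeIndex).
    A 13-week season yields 13, 78, 286 or 715 combinations for n_weeks = 1-4.
    """
    weeks = [wk for _, wk in daily_counts.groupby(pd.Grouper(freq="W"))]
    weeks = [wk for wk in weeks if len(wk) == 7 and wk.notna().all()]  # drop incomplete weeks
    return [pd.concat(combo).median() for combo in combinations(weeks, n_weeks)]

def icc3_1(ratings: np.ndarray) -> float:
    """Two-way mixed, single-measure, consistency ICC(3,1) for an n x k ratings matrix."""
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()    # between-targets
    ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()    # between-raters
    ss_err = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols  # residual
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
```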

No adjustment for weather was made in the behavioural stability analyses because the weather is closely associated with meteorological seasons, and therefore segregating analyses into meteorological seasons would sufficiently account for weather variability. Analyses were performed using SPSS Statistics version 28.0 (IBM, New York, USA).

Validity of automated active infrared counters

Bland–Altman plots [1] were used to determine the convergent validity of the automated active infrared counter in comparison to manual observation. Directional counts were used as separate data points; thus, 10 observation periods × two directions of counts = 20 data points for analysis. A one-sample t-test was used to determine systematic bias in the difference between the two count methods. The 95% limits of agreement were calculated as the mean difference ± 1.96 × the standard deviation of the differences. A linear regression was used to determine proportional bias in the difference between the two count methods. Finally, a linear regression was conducted to determine the concurrent validity of hourly automated active infrared counts against hourly manual observation counts.

A forced-entry simple linear regression was used to determine whether the total number of groups that passed through a counter predicted the difference between automated active infrared counts and manual observation counts.
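
A minimal sketch of these validity analyses (an assumed Python equivalent of the SPSS procedures; `infrared`, `manual`, and `n_groups` are hypothetical arrays holding the 20 directional one-hour counts and the corresponding numbers of groups):

```python
import numpy as np
from scipy import stats

def validity_stats(infrared: np.ndarray, manual: np.ndarray, n_groups: np.ndarray) -> dict:
    diff = infrared - manual                         # negative values = infrared undercounts
    mean_diff = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)             # half-width of the 95% limits of agreement
    _, p_bias = stats.ttest_1samp(diff, 0.0)         # systematic bias: is the mean difference zero?

    means = (infrared + manual) / 2
    proportional = stats.linregress(means, diff)     # proportional bias (Bland-Altman)
    concurrent = stats.linregress(infrared, manual)  # concurrent validity of infrared counts
    group_effect = stats.linregress(n_groups, diff)  # do groups explain the undercount?

    return {
        "mean_difference": mean_diff,
        "limits_of_agreement": (mean_diff - half_width, mean_diff + half_width),
        "systematic_bias_p": p_bias,
        "proportional_bias_r2": proportional.rvalue ** 2,
        "concurrent_validity_r2": concurrent.rvalue ** 2,
        "group_effect_r2": group_effect.rvalue ** 2,
    }
```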

Results

Weather trends

The average weather conditions for each season are provided in Table 1; these show similar ranges to the 1991 – 2020 climate averages for the area [19].

Table 1 Weather descriptive statistics for each meteorological season

Behavioural stability of abbreviated observational periods

Table 2 displays the mean ICCs and 95% confidence intervals (95% CI) for 1-, 2-, 3- and 4-week combinations for each counter, in comparison to the meteorological season median daily counts. On average, for all three counters, collecting data for 4 weeks in each season produced good or excellent consistency with the median daily counts obtained by collecting data for the entire meteorological season. Autumn and Spring tended to require fewer weeks to obtain at least good consistency with the entire season (e.g. Counter 2 obtained good behavioural stability with just one week) and had narrower confidence intervals than one-week schedules in Summer and Winter.

Table 2 Behavioural stability estimates using the mean ICC for 1, 2, 3 and 4-weeks compared with the entire meteorological season for median daily counts

Validity of automated active infrared counters

Manual observation counts during the ten one-hour observation periods ranged from 3 to 67 counts, with the number of groups per observation period ranging from 0 to 19. Automated active infrared counters recorded, on average, 4.65 fewer counts per hour than manual observation (mean difference -4.65 counts; 95% limits of agreement -12.4, 3.14 counts; p < 0.001; mean absolute percentage error 13.7%) per one-hour observation period (23.2 ± 15.6 vs 27.9 ± 18.9 counts per hour, respectively; -16.7% difference), demonstrating systematic bias. There was a negative association between the mean counts per one-hour observation and the difference between automated active infrared and manual observation counts (β -0.19, 95% CI -0.26, -0.13 counts, p < 0.001, intercept 0.31, r2 = 0.71), suggesting proportional bias (Fig. 4). The number of groups per one-hour observation period explained 78% (r2 = 0.78) of the variance in the difference between automated active infrared and manual observation counts (β -0.60, 95% CI -0.75, -0.44 counts, p < 0.001, intercept -0.48; Fig. 5). Automated active infrared counts explained 98% (r2 = 0.98) of the variance in manual observation counts per one-hour observation period (β 1.21, 95% CI 1.13, 1.28 counts, p < 0.001, intercept -0.096; Fig. 6).
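
As a worked check (assuming, as the figures above imply, that the percentage difference is expressed relative to the manual observation mean), the -16.7% figure follows from the mean difference and the manual mean:

$$\frac{-4.65}{27.9} \times 100 \approx -16.7\%.$$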

Fig. 4 Bland–Altman plot displaying the difference between automated active infrared and manual observation counts across one-hour observation periods. Scatter points represent a directional count for one-hour observation periods (two directional counts per observation period). Dashed line represents the mean difference between automated active infrared and manual observation counts across one-hour observation periods, while dotted lines represent the 95% Limits of Agreement. P < 0.001

Fig. 5 Relationship between the number of groups per one-hour observation period and the difference between automated active infrared and manual observation counts. Scatter points represent a directional count for one-hour observation periods (two directional counts per observation period). Dashed line represents the linear trendline (r2 = 0.78, p < 0.001). Solid line represents the line of unity

Fig. 6 Relationship between automated active infrared counts and manual observation counts. Scatter points represent a directional count for one-hour observation periods (two directional counts per observation period). Dashed line represents the linear trendline (r2 = 0.98, p < 0.001). Solid line represents the line of unity

Discussion

Summary of key findings

The behavioural stability study found that four weeks (28 days) of automated active infrared count data are required to estimate the median daily count for a meteorological season. The validation study indicated that automated active infrared counts are strongly associated (r2 = 0.98) with manual counts, albeit underestimating them by an average of 4.65 counts per one-hour observation period (mean absolute percentage error 13.7%). The number of groups passing through a counter explained 78% of the variance in the difference between automated active infrared and manual counts, suggesting that automated active infrared counters struggle to identify group sizes.

Comparisons with other count methods

Previous monitoring of outdoor physical activity in parks using automated infrared counters has employed data collection periods of varying lengths. Natural experimental studies using pre-post comparisons have collected count data over periods ranging from eight days [36] to 19 weeks (approximately five months) [11] per time point. Meanwhile, a validity study of automated active infrared counters (TrailMaster TM1550, Lenexa, USA) used to estimate footfall in Yosemite National Park, USA, employed a 122-day observation period [26]. Trend monitoring studies have also used observation periods ranging from two months [17] to two years [27]. There therefore appears to be little consensus on how many monitoring days per observation period are enough to provide estimates of footfall. The current study has begun the process of establishing a recommended minimum observation period for meteorological seasonal footfall estimates when using a SensMax DE active infrared counter (SensMax Ltd, Riga, Latvia). This is important because, while it may be feasible for municipalities, national parks, or large funded research projects to conduct continuous year-round monitoring of footfall, the labour and time costs associated with infrared counters may make continuous monitoring untenable for local government agencies. More research is required using counters from different manufacturers and in different study locations (geographical, topographical, climatic, and socio-cultural contexts) before a generalisable consensus on infrared counter behavioural stability can be reached. Notably, in the validation data collection the SensMax DE only underestimated counts, which reflects the binary classification (count or no count) provided by this counter. In theory, the SensMax should only underestimate counts, but in the field overestimation may occur due to foliage, large animals, or people intentionally breaking the infrared beam. Researchers should therefore consider mitigation approaches to reduce these risks of overestimation.

The current study found that SensMax DE bi-directional automated active infrared counters (SensMax Ltd, Riga, Latvia) underestimated total counts (-4.65 counts per one-hour observation period; mean absolute percentage error 13.7%). This systematic bias represented a -16.7% difference between automated active infrared counts and manual counts, regardless of the direction of travel, which is a larger underestimation than that reported for integrated passive infrared inductive loop counters (-10.3% difference in pedestrian counts; Eco-Multi Sensor, EcoCounter, Lannion, France), which have also undergone a 10-h validation comparison against manual counts [16] and have recently been used in a park natural experiment evaluation [11]. However, the SensMax DE automated active infrared counter displays a similar explained variance of manual counts (r2 = 0.98) to the Eco-Multi Sensor (r2 = 0.91 – 0.99) [16]. Compared with the Trail-Master 1500 active infrared counter (Trail-Master, Lenexa, USA), the SensMax counter in the current study displays a similar explained variance of manual counts (SensMax: r2 = 0.98, Trail-Master: r2 = 0.99) and a similar regression coefficient gradient (SensMax: β 1.21, Trail-Master: β 1.15) [17]. Comparisons can also be drawn with the large-scale calibration study of the TrailMaster TM1550 model (Trail-Master, Lenexa, USA) in Yosemite National Park, USA, which estimated an explained variance of manual counts of 0.96 – 0.99 and regression coefficient gradients of 1.57 – 1.83 [26]. Therefore, the SensMax DE bi-directional automated active infrared counter (SensMax Ltd, Riga, Latvia) performs similarly to other infrared counter types previously used to estimate footfall in parks. The findings of the current study thus suggest that the SensMax DE can provide valid estimates of footfall and is recommended for use in park footfall monitoring research.

Practical considerations when using automated counters

Based on this study, there are a number of practical recommendations for researchers using automated counters. Automated counters often have a substantial battery life; the SensMax DE counter runs on two AA batteries, has a battery life of over one year, and has memory to record up to 150 days of data, which means it can be deployed over long periods without maintenance by researchers. However, it is recommended that weekly site visits are conducted to download the data and check the counters for vandalism or disturbances (e.g. the counter peeling off the double-sided adhesive attachment inside the outdoor housing case). In the current study, the sturdy outdoor housing case and the use of four woodscrews to fix the case to fenceposts made the counters difficult to vandalise or damage unintentionally. Across one year of monitoring, there were only six cases of counter vandalism or damage. Nevertheless, it is highly likely that vandalism will occur, so researchers should have a contingency budget to replace and repair counters. Practical approaches to reduce the risk of vandalism include: (1) embedding the outdoor housing case within the structure it is attached to, so that a saw cannot reach the woodscrews, the presence of the counters is less noticeable, and only the front of the case can be hit with an object; (2) erecting posters around the monitoring site so visitors are informed that counting is occurring and what data are being collected; and (3) promoting the research to establish community awareness of the project, which can encourage visitors to act as a self-policing community who deter vandals and report any suspicious activity to the researchers or police.

Furthermore, caution is needed if the counters are attached to wooden fence posts, as the posts can warp in changing weather conditions, causing the counters’ infrared beams to misalign. If misalignment does occur because the fixing site changes shape, the counters can be realigned by changing the position of the counter within the outdoor housing case or by loosening the woodscrews to change the angle of the case.

Strengths and limitations

A strength of this study was the use of year-long automated active infrared counter data from three separate counters, which provided up to 1,095 days of count records. This large sample offers reassurance that the findings are representative of the entire year and not due to sampling error. This study also demonstrates the effect of group presence on automated count bias, which had previously been assumed but not formally assessed [10]. We have provided new methodological and practical recommendations for researchers using automated active infrared counters in the growing field of natural experimental studies of urban environment interventions on outdoor physical activity.

The main limitation of this study is that data collection began in May 2021, during the coronavirus-19 pandemic. At this time point, England had entered Step 2 of coronavirus-19 lockdown easing: non-essential retail and outdoor venues had reopened, but no indoor mixing between different households was allowed. These restrictions varied over the course of the study period. For instance, on 19th July 2021, most legal limits on social contact were removed and the final closed sectors were reopened [32]. Yet on 10th December 2021, England entered ‘Plan B’ restrictions, making face masks and contact tracing compulsory in most indoor settings and encouraging people to work from home [35]. Although there was nothing the researchers could do to overcome this limitation, it is likely that changes in coronavirus-19 restrictions and public opinions of the virus had some influence on the use of outdoor spaces [27] and on the recorded counts in the current study. However, even with these potential fluctuations in counts caused by responses to coronavirus-19, the current study still demonstrated that only four weeks of automated active infrared counts were needed to provide an estimate of median daily counts per meteorological season. If footfall is presumed to be more consistent across the year without the effects of coronavirus-19 restrictions, then fewer weeks of count data may be needed to obtain seasonal estimates. Furthermore, the authors made justified decisions regarding the handling of outliers; however, the generalisability of the resultant findings may be limited depending on how future studies design their data collection and analysis protocols, such as whether public holidays, events, or extreme outliers are included in their datasets. Population monitoring of physical activity in public park settings is subject to physical and socio-cultural contexts that are unique to the study location. Therefore, replication studies are required in differing contexts to build a consensus on how many weeks of infrared counter monitoring are required, and in what contexts, to provide behaviourally stable estimates of footfall for each meteorological season.

Conclusion

This study provided novel insights into the application of automated active infrared counters for footfall monitoring in parks. Findings suggested that at least four weeks of automated active infrared counter data were required to provide estimates of median daily counts per meteorological season in an English urban park. Although automated active infrared counts underestimated manual counts, the two were still strongly associated. The main cause of automated active infrared count error was the presence of groups walking through the counters. Further research is needed to provide behavioural stability estimates and validation of automated active infrared counters in different climates, localities, and socio-cultural contexts to build a robust evidence base that informs the appropriate use of infrared counters in different contexts.

Availability of data and materials

The dataset supporting the conclusions of this article is available in the University repository, Pure, https://pure.northampton.ac.uk/en/datasets/using-automated-active-infrared-counters-to-estimate-footfall-on-; https://doi.org/10.24339/3ef61812-75c7-4431-8423-7358d0c296f2.

Abbreviations

CI:

Confidence intervals

ICC:

Intraclass correlation coefficient

IQR:

Interquartile Range

MOHAWk:

Method for Observing pHysical Activity and Wellbeing

Q:

Quartile

SOPARC:

System for Observing Play and Recreation in Communities

References

  1. Altman DG, Bland JM. Measurement in Medicine: The Analysis of Method Comparison Studies. The Statistician. 1983;32:307. https://doi.org/10.2307/2987937.

  2. Benton JS, Anderson J, Hunter RF, French DP. The effect of changing the built environment on physical activity: a quantitative review of the risk of bias in natural experiments. Int J Behav Nutr Phys Act. 2016;13(1):107. https://doi.org/10.1186/S12966-016-0433-3.

  3. Benton JS, Anderson J, Pulis M, Cotterill S, Hunter RF, French DP. Method for Observing pHysical Activity and Wellbeing (MOHAWk): validation of an observation tool to assess physical activity and other wellbeing behaviours in urban spaces. Cities & Health 2020;6(4):1–15. https://doi.org/10.1080/23748834.2020.1775383.

  4. Chan CB, Ryan DA. Assessing the effects of weather conditions on physical activity participation using objective measures. Int J Environ Res Public Health. 2009;6:2639–54. https://doi.org/10.3390/ijerph6102639.

  5. Chen L, Grimstead I, Bell D, Karanka J, Dimond L, James P, et al. Estimating Vehicle and Pedestrian Activity from Town and City Traffic Cameras. Sensors. 2021;21:4564. https://doi.org/10.3390/s21134564.

  6. Cohen DA, Setodji C, Evenson KR, Ward P, Lapham S, Hillier A, et al. How much observation is enough? Refining the administration of SOPARC. J Phys Act Health. 2011;8:1117–23. https://doi.org/10.1123/jpah.8.8.1117.

  7. Craig P, Cooper C, Gunnell D, Haw S, Lawson K, Macintyre S, et al. Using natural experiments to evaluate population health interventions: New medical research council guidance. J Epidemiol Community Health (1978). 2012;66:1182–6. https://doi.org/10.1136/jech-2011-200375.

  8. Department for Transport. Gear Change: a bold vision for cycling and walking. London, England; 2020.

  9. Department for Transport. Active Travel Fund Monitoring Guidance: connecting people and places. London, England; 2020.

  10. Granner ML, Sharpe PA. Monitoring physical activity: Uses and measurement issues with automated counters. J Phys Act Health. 2004;1:131–41.

  11. Grunseit A, Crane M, Klarenaar P, Noyes J, Merom D. Closing the loop: Short term impacts on physical activity of the completion of a loop trail in Sydney, Australia. Int J Behav Nutr Phys Act. 2019;16:1–12. https://doi.org/10.1186/S12966-019-0815-4/TABLES/2.

  12. Hunter RF, Christian H, Veitch J, Astell-Burt T, Hipp JA, Schipperijn J. The impact of interventions to promote physical activity in urban green space: A systematic review and recommendations for future research. Soc Sci Med. 2015;124:246–56. https://doi.org/10.1016/J.SOCSCIMED.2014.11.051.

  13. Kelly P, Fitzsimons C, Baker G. Should we reframe how we think about physical activity and sedentary behaviour measurement? Validity and reliability reconsidered. Int J Behav Nutr Phys Act. 2016;13:32. https://doi.org/10.1186/s12966-016-0351-4.

  14. Koo TK, Li MY. A Guideline of Selecting and Reporting Intraclass Correlation Coefficients for Reliability Research. J Chiropr Med. 2016;15:155–63. https://doi.org/10.1016/J.JCM.2016.02.012.

  15. Lee K, Sener IN. Strava Metro data for bicycle monitoring: a literature review. Transp Rev. 2021;41:27–47. https://doi.org/10.1080/01441647.2020.1798558.

  16. Lindsey G, Petesch M, Hankey S. The Minnesota Bicycle and Pedestrian Counting Initiative: Implementation Study. St. Paul, Minnesota: Department for Transportation; 2015.

  17. Lindsey G, Nguyen DBL. Use of Greenway Trails in Indiana. J Urban Plan Dev. 2004;130(4). https://doi.org/10.1061/(ASCE)0733-9488(2004)130:4(213).

  18. Madden K, Ramsey E, Loane S, Condell J. Trailgazers: A Scoping Study of Footfall Sensors to Aid Tourist Trail Management in Ireland and Other Atlantic Areas of Europe. Sensors. 2021;21:2038. https://doi.org/10.3390/s21062038.

  19. Met Office. Northampton Moulton Park (Northamptonshire) UK climate averages - Met Office. 2022. https://www.metoffice.gov.uk/research/climate/maps-and-data/uk-climate-averages/gcr37upbm. Accessed 9 Aug 2022.

  20. Michie S, van Stralen MM, West R. The behaviour change wheel: A new method for characterising and designing behaviour change interventions. Implement Sci. 2011;6:42. https://doi.org/10.1186/1748-5908-6-42.

  21. Milat AJ, Stubbs J, Engelhard S, Weston P, Fitzgerald S, Giles-Corti B. Measuring physical activity in public open space - An electronic device versus direct observation. Aust N Z J Public Health. 2002;26:50–1. https://doi.org/10.1111/j.1467-842X.2002.tb00270.x.

  22. mySociety. Index of Multiple Deprivation 2019 - Northampton Lower Super Output Areas. 2021. https://research.mysociety.org/sites/imd2019/area/la-northampton-borough-council/lsoa/. Accessed 18 May 2022.

  23. Natural England. Monitoring Engagement in the Natural Environment Survey (2009–2019): The Upper Tier Local Authority Dashboard. 2020. https://defra.maps.arcgis.com/apps/MapSeries/index.html?appid=2f24d6c942d44e81821c3ed2d4ab2ada. Accessed 19 May 2022.

  24. Natural England. Introduction to the Green Infrastructure Framework - Principles and Standards for England. 2023. https://designatedsites.naturalengland.org.uk/GreenInfrastructure/Home.aspx. Accessed 3 Feb 2023.

  25. Office for National Statistics. Exploring local income deprivation. 2021. https://www.ons.gov.uk/visualisations/dvc1371/#/E07000154. Accessed 18 May 2022.

  26. Pettebone D, Newman P, Lawson SR. Estimating visitor use at attraction sites and trailheads in Yosemite National Park using automated visitor counters. Landsc Urban Plan. 2010;97:229–38. https://doi.org/10.1016/J.LANDURBPLAN.2010.06.006.

  27. Power D, Lambe B, Murphy N. Trends in recreational walking trail usage in Ireland during the COVID-19 pandemic: Implications for practice. JORT 2023;41:100477. https://doi.org/10.1016/J.JORT.2021.100477.

  28. Public Health England. Improving access to greenspace. A new review for 2020. 2020.

  29. Ryan DJ. Does the creation of a walking loop using directional wayfinding signage increase the physical activity of country park visitors? A natural experiment. OSF 2021. https://doi.org/10.17605/OSF.IO/PGE72.

  30. Ryan DJ, Hardwicke J, Kay AD. Evaluation report – Delapré Cycling and Walking Social Prescription – Baseline Phase. Northampton, England: University of Northampton; 2022.

  31. Terwee CB, Mokkink LB, Hidding LM, Altenburg TM, van Poppel MN, Chinapaw MJM. Comment on “Should we reframe how we think about physical activity and sedentary behavior measurement? Validity and reliability reconsidered.” Int J Behav Nutr Phys Act. 2016;13:66. https://doi.org/10.1186/s12966-016-0392-8.

  32. The Institute for Government. Timeline of UK government coronavirus lockdowns and restrictions. 2022. https://www.instituteforgovernment.org.uk/charts/uk-government-coronavirus-lockdowns. Accessed 27 Jul 2022.

  33. Tucker P, Gilliland J. The effect of season and weather on physical activity: A systematic review. Public Health. 2007;121:909–22. https://doi.org/10.1016/j.puhe.2007.04.009.

  34. UK Government. New executive agency Active Travel England launches - GOV.UK. 2022. https://www.gov.uk/government/speeches/new-executive-agency-active-travel-england-launches. Accessed 17 May 2022.

  35. UK Government. Green social prescribing: call for expressions of interest - GOV.UK. 2021. https://www.gov.uk/government/publications/green-social-prescribing-call-for-expressions-of-interest/green-social-prescribing-call-for-expressions-of-interest. Accessed 17 May 2022.

  36. Veitch J, Salmon J, Carver A, Timperio A, Crawford D, Fletcher E, et al. A natural experiment to examine the impact of park renewal on park-use and park-based physical activity in a disadvantaged neighbourhood: The REVAMP study methods. BMC Public Health. 2014;14:1–9. https://doi.org/10.1186/1471-2458-14-600/TABLES/1.

  37. Weather Underground. Personal Weather Station Dashboard Northampton - INORTHAM95. 2022. https://www.wunderground.com/dashboard/pws/INORTHAM95/graph/2021-06-28/2021-06-28/daily. Accessed 19 Jul 2022.

Acknowledgements

The authors would like to acknowledge the significant contributions of Richard Clinton (Delapré Abbey Preservation Trust), Jackie Browne (Northamptonshire Sport), Peter Hackett (West Northamptonshire Council), and Peter Boddington (Delapré Abbey Preservation Trust) in the design and delivery of the project.

Funding

DJR was awarded QR seedcorn funding by the University of Northampton. JB is funded by a Wellcome Trust ISSF Fellowship (204796). For the purpose of open access, the authors have applied a Creative Commons Attribution (CC BY) licence to any Author Accepted Manuscript version arising.

Author information

Contributions

DJR led the conceptualisation and design of the study. JB provided input into the design of the study. DJR conducted all data collection. JB conducted the behavioural stability analyses and DJR conducted the validity and weather analyses. DJR drafted the manuscript and JB contributed to the revision of the manuscript. Both authors read and approved this final version of the manuscript.

Corresponding author

Correspondence to D. J. Ryan.

Ethics declarations

Ethics approval and consent to participate

Ethical approval was granted by the University of Northampton Faculty Ethics Committee (approval code: 202102). The need for informed consent to participate in the study was waived by the ethics committee.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Cite this article

Ryan, D.J., Benton, J.S. Using automated active infrared counters to estimate footfall on urban park footpaths: behavioural stability and validity testing. Int J Behav Nutr Phys Act 20, 49 (2023). https://doi.org/10.1186/s12966-023-01438-w
