Composite activity type and stride-specific energy expenditure estimation model for thigh-worn accelerometry

Abstract

Background

Accurately measuring energy expenditure during physical activity outside the laboratory is challenging, especially at scale. Thigh-worn accelerometers have gained popularity because they allow physical activity types to be detected accurately. The use of machine learning techniques for activity classification and energy expenditure prediction may improve accuracy over current methods. Here, we developed a novel composite energy expenditure estimation model by combining an activity classification model with a stride-specific energy expenditure model for walking, running, and cycling.

Methods

We first trained a supervised deep learning activity classification model using pooled data from available adult accelerometer datasets. The composite energy expenditure model was then developed and validated using additional data based on a sample of 69 healthy adult participants (49% female; age = 25.2 ± 5.8 years) who completed a standardised activity protocol with indirect calorimetry as the reference measure.

Results

The activity classification model showed an overall accuracy of 99.7% across all five activity types during validation. The composite model for estimating energy expenditure achieved a mean absolute percentage error of 10.9%. For walking, running, and cycling, the composite model achieved mean absolute percentage errors of 6.6%, 7.9%, and 16.1%, respectively.

Conclusions

The integration of thigh-worn accelerometers with machine learning models provides a highly accurate method for classifying physical activity types and estimating energy expenditure. Our novel composite model approach improves the accuracy of energy expenditure measurements and supports better monitoring and assessment methods in non-laboratory settings.

Background

An individual’s physical behaviour includes physical activity and sedentary behaviour. Both are known to be important and modifiable behavioural factors affecting human health [1, 2]. The underlying dimensions of physical behaviour include intensity, duration, domain and type of activity or posture [3]. An important component of physical activity intensity is energy expenditure. Activity-related energy expenditure is the most variable component of total energy expenditure and can be modified by changes in physical behaviour [4]. Changes in energy expenditure can have a significant impact on energy balance and contribute to successful weight management [5]. Recent evidence from the UK Biobank suggests a strong association between activity-related energy expenditure and the incidence of type 2 diabetes [6]. In addition, accurate measures of energy expenditure can be incorporated into closed-loop insulin delivery systems and continuous glucose monitoring systems to improve glycaemic control in people with diabetes [7]. Thus, more accurate measurement of energy expenditure can help to understand energy balance and develop effective interventions to prevent and manage highly prevalent diseases such as obesity or type 2 diabetes.

While energy expenditure can be accurately estimated in the laboratory using various methods such as indirect calorimetry, it remains challenging in free living conditions [4]. The Doubly Labelled Water method is considered the gold standard for assessing free-living energy expenditure, but it remains costly, requires advanced infrastructure, and therefore cannot be easily scaled for large-scale use in research and medical settings.

Wearable devices such as accelerometers and smartwatches are now widely used to assess physical activity and can potentially be used to estimate free-living energy expenditure based on acceleration parameters or heart rate. However, existing wearable systems and estimation methods have shown limited accuracy in estimating energy expenditure [8,9,10]. Recent advances in the accuracy of accelerometer-based energy expenditure estimation have been made by using machine learning-based algorithms to classify activity types and combining these with activity-specific estimation models [11, 12]. Previous research also suggests that stride-segmented estimation methods based on lower body kinematics can be used for more accurate estimation of energy expenditure [13]. Most of these novel approaches have been developed and applied to hip- and wrist-worn accelerometers or require the use of multiple sensor placements.

Thigh-worn accelerometers are increasingly being used in large epidemiological studies due to the ability to accurately classify a range of activity types and postures using data-driven algorithms and raw acceleration signals [3, 14, 15]. A variety of machine learning techniques have been applied to thigh-worn accelerometer data, mainly focusing on activity type and posture classification [16,17,18,19] and intensity categories [20]. To date, a small number of studies have attempted to estimate energy expenditure from thigh-worn accelerometry, primarily using simple regression techniques on aggregated acceleration metrics or cadence [10, 21,22,23].

The recent success of more advanced energy expenditure estimation approaches, coupled with the ability to accurately classify activity type and available information on thigh kinematics, suggests the potential for more accurate energy expenditure estimation using the raw acceleration signal from a thigh-worn sensor. The aim of this study was to train a composite machine learning model to estimate energy expenditure from raw thigh accelerometer data by applying activity classification and stride segmentation to the raw accelerometer data, and to validate the model’s performance against indirect calorimetry.

Methods

We developed a composite activity type and stride-specific energy expenditure estimation (CATSE3) model to enable energy expenditure estimation based on raw 3D acceleration. The multi-model approach involves several key steps and algorithms which are outlined in Fig. 1:

  1. Pre-processing: The raw acceleration signal is resampled, auto-calibrated and low-pass filtered using an eighth-order Butterworth filter.

  2. Activity classification: A classification algorithm identifies the activity type based on 4-second epochs of time-series 3D acceleration data.

  3. Stride segmentation: Individual strides are identified within sequences of walking, running, and cycling. The 3D acceleration data is segmented into individual strides.

  4. Energy expenditure estimation: Depending on the predicted activity type, energy expenditure is estimated based on individual strides (for walking, running, and cycling) or aggregated acceleration metrics (for sitting and standing).
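The routing logic of the final step can be sketched as a small dispatch function. This is a minimal illustration of the composite idea, not the authors' implementation; all function and variable names are hypothetical:

```python
# Activities whose energy expenditure is estimated per stride
STRIDE_ACTIVITIES = {"walking", "running", "cycling"}

def estimate_epoch_ee(activity, strides=None, enmo=None,
                      stride_model=None, enmo_model=None):
    """Route one 4-second epoch to the appropriate estimator:
    a stride-based model for locomotion and cycling, or an
    aggregated-metric model for sitting and standing.
    All names here are illustrative, not the authors' API."""
    if activity in STRIDE_ACTIVITIES:
        # Average the per-stride predictions within the epoch
        return sum(stride_model(s, activity) for s in strides) / len(strides)
    # Sitting/standing: activity-specific regression on an aggregated metric
    return enmo_model(enmo, activity)
```

Plugging in stand-in models shows the two code paths: a stride-based estimate for walking, and a regression on the aggregated metric for sitting.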

Fig. 1

Processing steps of the CATSE3 model. Raw triaxial acceleration data is required as input and a time series with energy expenditure predictions is the resulting output

Sample population

The study included data collected from 69 healthy adults divided into two independent samples. Data from the training sample (n = 49; 45% female; age = 26.0 ± 6.3 years; BMI = 23.0 ± 2.6 kg/m²) were used to develop the processing pipeline and train the energy expenditure models. Data from the test sample (n = 20; 60% female; age = 23.2 ± 3.8 years; BMI = 23.4 ± 1.8 kg/m²) were used only to evaluate the developed approach.

Participants were recruited via newsletters, social media, and posters at the university campus. Participants had no known disability and were free to engage in physical activity. Each participant provided written informed consent prior to participation in the study, which was approved by the Ethics Committee of the German Sport University Cologne (ref. 142/2023). Participants were financially compensated for their time.

Instrumentation

Triaxial acceleration was sampled at 100 Hz with a dynamic range of ± 8 g using an inertial measurement unit (AX6, Axivity Ltd, Newcastle upon Tyne, UK). Prior to each session, a calibration routine was performed in which the sensor was mounted inside a rectangular plastic cube. The cube was then placed on each side for 10 s to establish reference values for each axis. The device was attached directly to the skin on the lateral side of the participant’s dominant thigh, midway between the hip and knee. The device was first attached to a skin-friendly adhesive tape (Fixomull stretch; BSN medical GmbH, Hamburg, Germany) using double-sided adhesive tape, which was then placed on the skin. The device was secured with an elastic bandage (Peha-haft Color; Paul Hartmann AG, Heidenheim, Germany) wrapped around the thigh.

Walking and running were performed on an instrumented treadmill (quasar med; h/p/cosmos sports & medical GmbH, Nussdorf-Traunstein, Germany) and cycling was performed on a stationary bicycle ergometer (ergoselect 100; ergoline GmbH, Bitz, Germany). Indirect calorimetry was used to estimate reference energy expenditure via gas exchange using a metabolic cart (Metalyzer 3B, Cortex Biophysik GmbH, Leipzig, Germany). The metabolic cart was calibrated prior to each session following manufacturer instructions.

Activity protocol

Each subject was asked to complete a standardised activity protocol including sitting, standing, walking, running, and cycling. The protocol varied slightly between the training and validation samples to introduce additional variability (Table 1). It was designed to include a range of basic activities and intensities while adhering to best practice recommendations [24]. Participants were allowed to skip any activity condition that they were unable to complete. Unless otherwise stated, all treadmill walking and running conditions were performed at a 1% incline to account for the reduced energy cost of treadmill compared with overground locomotion [24]. For all cycling conditions, revolutions per minute (rpm) were chosen ad libitum between 60 and 80 rpm. Each activity condition lasted 6 min. For the walking, running, and cycling conditions, participants had a 5-min rest period between each condition.

Table 1 Activity protocol for the training and validation group

Data processing

Oxygen uptake and carbon dioxide output were measured on a breath-by-breath basis throughout the study and converted to energy expenditure using Weir’s equation [25]. Steady state energy expenditure was calculated by averaging all breaths over the last three minutes of each condition and expressed as kilocalories per minute per kilogram of body weight.
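The conversion from gas exchange to energy expenditure can be sketched with the abbreviated Weir equation (kcal/min = 3.941 × VO₂ + 1.106 × VCO₂, with VO₂ and VCO₂ in L/min). Function names are illustrative:

```python
def weir_kcal_per_min(vo2_l_min, vco2_l_min):
    """Abbreviated Weir equation: energy expenditure in kcal/min
    from oxygen uptake and carbon dioxide output in L/min."""
    return 3.941 * vo2_l_min + 1.106 * vco2_l_min

def steady_state_ee(vo2_breaths, vco2_breaths, body_mass_kg):
    """Average breath-by-breath gas exchange over a steady-state
    window and normalise to body mass (kcal/kg/min)."""
    vo2 = sum(vo2_breaths) / len(vo2_breaths)
    vco2 = sum(vco2_breaths) / len(vco2_breaths)
    return weir_kcal_per_min(vo2, vco2) / body_mass_kg
```

In the study, the steady-state window corresponds to the last three minutes of each activity condition.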

Raw acceleration data were downloaded from the sensors using OMGUI v1.0.0.30 [26] and saved as comma-separated values (CSV) files. The raw data were pre-processed using the actipy package v3.0.5 [27]. First, the acceleration data were low-pass filtered using an 8th order Butterworth filter with a cut-off frequency of 20 Hz. The data were then auto-calibrated against local gravity using a validated approach [28] and resampled to ensure a sampling rate of 100 Hz. The acceleration data were then partitioned into non-overlapping 4-second intervals.
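A minimal sketch of the filtering and epoching steps, using the parameters stated above (8th-order Butterworth, 20 Hz cut-off, 100 Hz sampling, 4-second epochs). Auto-calibration and resampling are omitted, and the use of zero-phase filtering here is an assumption rather than the authors' exact pipeline:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 100  # sampling rate in Hz

def preprocess(acc, fs=FS, cutoff=20, epoch_s=4):
    """Low-pass filter raw triaxial acceleration (n_samples x 3)
    and split it into non-overlapping epochs.
    Returns an array of shape (n_epochs, epoch_s * fs, 3)."""
    sos = butter(8, cutoff, btype="low", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, acc, axis=0)  # zero-phase filtering
    n = epoch_s * fs
    n_epochs = filtered.shape[0] // n  # drop any incomplete trailing epoch
    return filtered[: n_epochs * n].reshape(n_epochs, n, acc.shape[1])
```

Ten seconds of data at 100 Hz would thus yield two complete 4-second epochs, with the remainder discarded.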

Classification of activity types

We trained a deep learning-based activity classification model to predict the activity type based on 4-second intervals of 3D acceleration data. To train the classification model, we used pooled data from three existing accelerometer datasets, totalling n = 69 healthy adult participants, independent of the data collected in this study. The characteristics of each dataset and the training process are detailed in Additional File 1. The pooled dataset included various laboratory and free-living physical activities, including walking, running, standing, sitting, lying down, and cycling. Reference labels for the activity type were obtained either from direct observation or from video annotation.

A supervised machine learning approach was used, where the model was trained on labelled data (i.e. 3D acceleration data and corresponding activity type label). During the training process, the model parameters are iteratively adjusted to minimise a given loss function corresponding to the overall classification error. We chose a hybrid Convolutional Neural Network (CNN) and Bidirectional Long-Short Term Memory (BiLSTM) model architecture to predict the probability of each activity class for a given 4-second interval. The hybrid architecture allows the deep learning model to extract spatial features based on the CNN layers while also learning temporal patterns using the BiLSTM layer. Previous research has successfully applied a hybrid CNN-BiLSTM approach to classify sedentary behaviour from hip-worn accelerometer data [29]. Model training and optimisation was performed using Keras [30] and KerasTuner [31]. We chose a window size of 4 s based on previous findings [16] and the time required to capture at least one full stride during slow walking.
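The hybrid architecture described above can be sketched in Keras as convolutional blocks feeding a bidirectional LSTM and a softmax head. Layer counts and sizes are illustrative assumptions, not the tuned architecture from Additional File 1:

```python
from tensorflow import keras
from tensorflow.keras import layers

N_CLASSES = 5   # sitting, standing, walking, running, cycling
WINDOW = 400    # 4 s at 100 Hz, triaxial input

def build_cnn_bilstm(window=WINDOW, n_channels=3, n_classes=N_CLASSES):
    """Hypothetical CNN-BiLSTM sketch: Conv1D blocks extract spatial
    features, a bidirectional LSTM captures temporal patterns, and a
    softmax layer outputs per-class probabilities."""
    inputs = keras.Input(shape=(window, n_channels))
    x = layers.Conv1D(32, 5, activation="relu", padding="same")(inputs)
    x = layers.MaxPooling1D(2)(x)
    x = layers.Conv1D(64, 5, activation="relu", padding="same")(x)
    x = layers.MaxPooling1D(2)(x)
    x = layers.Bidirectional(layers.LSTM(64))(x)
    outputs = layers.Dense(n_classes, activation="softmax")(x)
    return keras.Model(inputs, outputs)
```

Training would then proceed with a categorical cross-entropy loss, with hyperparameters searched via KerasTuner as described in the text.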

Segmentation of strides

For each continuous sequence (i.e. one or more 4-second segments) of walking, running, or cycling, individual strides are identified and segmented using an activity-specific approach. First, the acceleration signal is low-pass filtered using a fourth-order Butterworth filter. For running and cycling sequences, the acceleration along the x-axis (i.e. the vertical axis of the thigh when standing still) is low-pass filtered with a cut-off frequency of 5 Hz; similarly, for walking, the acceleration along the z-axis (sagittal axis) is low-pass filtered with a cut-off frequency of 5 Hz. After filtering, each of these signals is passed to a peak detection algorithm that identifies strides as peaks in the respective acceleration signal. In a final step, the acceleration data of each stride is resampled to a fixed length of 30 samples per stride using Fourier-based resampling [13].
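The segmentation of a single axis can be sketched as follows, using the filter parameters stated above. The minimum stride duration used for peak detection is an assumption, and the helper names are illustrative:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, find_peaks, resample

def segment_strides(axis_signal, fs=100, cutoff=5, n_bins=30, min_stride_s=0.5):
    """Illustrative stride segmentation: low-pass filter one
    acceleration axis, detect peaks as stride boundaries, and
    Fourier-resample each stride to a fixed length of n_bins samples."""
    sos = butter(4, cutoff, btype="low", fs=fs, output="sos")
    filt = sosfiltfilt(sos, axis_signal)
    # Enforce a minimum distance between peaks (assumed stride duration)
    peaks, _ = find_peaks(filt, distance=int(min_stride_s * fs))
    strides = [filt[a:b] for a, b in zip(peaks[:-1], peaks[1:])]
    return [resample(s, n_bins) for s in strides]
```

On a periodic signal (e.g. a 1 Hz sinusoid standing in for steady walking), each cycle between consecutive peaks becomes one fixed-length stride.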

Estimation of energy expenditure

A hybrid temporal convolutional network (TCN) deep learning model was trained to predict energy expenditure during running, walking, and cycling using the 3D acceleration of a stride and activity type as model inputs. The TCN model architecture [32] allows learning from sequential data and has previously been used to successfully estimate gait events based on raw time-series data from a single inertial measurement unit (IMU) worn on the shank [33]. We used the KerasTCN implementation [34] for model training. The model architecture and training procedures are described in more detail in Additional File 2.
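The stride-level regressor can be sketched with dilated causal convolutions in plain Keras. This is a generic TCN-style stack for illustration only; the authors used the KerasTCN implementation, and the layer sizes, dilation rates, and input encoding here are assumptions:

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_stride_tcn(n_bins=30, n_channels=3, n_types=3):
    """Hypothetical TCN-style regressor: dilated causal Conv1D layers
    over the 30-sample stride acceleration, concatenated with a
    one-hot activity type, predicting one energy expenditure value."""
    acc_in = keras.Input(shape=(n_bins, n_channels))   # stride acceleration
    type_in = keras.Input(shape=(n_types,))            # one-hot activity type
    x = acc_in
    for d in (1, 2, 4):  # growing dilation widens the receptive field
        x = layers.Conv1D(32, 3, dilation_rate=d,
                          padding="causal", activation="relu")(x)
    x = layers.GlobalAveragePooling1D()(x)
    x = layers.Concatenate()([x, type_in])
    out = layers.Dense(1)(x)  # predicted EE for the stride (kcal/kg/min)
    return keras.Model([acc_in, type_in], out)
```

A mean-squared-error loss against the calorimetry-derived reference would be the natural training objective for such a regressor.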

In addition, we developed a number of different linear regression models to estimate energy expenditure based on the Euclidean Norm Minus One (ENMO) and the Mean Amplitude Deviation (MAD), which were calculated over each 4-second epoch. For baseline comparison, two simple linear regression models were fitted to the training set data to predict energy expenditure as the dependent variable using either ENMO or MAD. Two additional models were fitted by further including the activity type and its interaction effect with the acceleration metric (i.e. ENMO or MAD). These activity-specific linear regression models allowed the use of aggregated acceleration metrics to be compared with the use of more detailed stride-specific acceleration signals in the hybrid TCN model. The CATSE3 approach integrates the hybrid TCN model for walking, running, and cycling sequences and the ENMO-based activity-specific regression model for sitting and standing sequences.
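The activity-specific regression with an interaction term amounts to fitting a separate intercept and slope per activity type. A minimal least-squares sketch, with illustrative function names:

```python
import numpy as np

def fit_activity_specific_lr(metric, activity_ids, ee, n_types=5):
    """Least-squares fit of EE ~ activity type + metric x activity type,
    i.e. one intercept and one slope per activity (illustrative)."""
    n = len(metric)
    X = np.zeros((n, 2 * n_types))
    for i, (m, a) in enumerate(zip(metric, activity_ids)):
        X[i, a] = 1.0           # activity-specific intercept
        X[i, n_types + a] = m   # activity-specific slope on ENMO/MAD
    coef, *_ = np.linalg.lstsq(X, np.asarray(ee), rcond=None)
    return coef

def predict_lr(coef, metric, activity_ids, n_types=5):
    """Predict EE from the fitted per-activity coefficients."""
    return [coef[a] + coef[n_types + a] * m
            for m, a in zip(metric, activity_ids)]
```

With exactly linear synthetic data, the fit recovers the per-activity intercepts and slopes, which makes the comparison against the stride-specific TCN model straightforward.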

Statistical analysis

The classification performance for the trained CNN-BiLSTM activity classification model as well as the accuracy of the overall energy expenditure model was evaluated on the test dataset to provide robust estimates based on unseen data.

For the classification, the metrics used for evaluation included overall accuracy, recall, precision, and F1 score. The overall accuracy was calculated as the number of true positive predictions divided by the total number of samples. Recall was calculated as the number of true positives divided by the sum of true positives and false negatives. Precision was calculated as the number of true positives divided by the sum of true and false positives. The F1 score was calculated as the harmonic mean of recall and precision.
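The classification metrics defined above translate directly into code; a minimal per-class helper:

```python
def accuracy(y_true, y_pred):
    """Overall accuracy: correct predictions over total samples."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def precision_recall_f1(y_true, y_pred, cls):
    """Per-class precision, recall, and F1 from label lists."""
    tp = sum(t == cls and p == cls for t, p in zip(y_true, y_pred))
    fp = sum(t != cls and p == cls for t, p in zip(y_true, y_pred))
    fn = sum(t == cls and p != cls for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

For example, one missed "walk" among two true walks gives a recall of 0.5 at a precision of 1.0, and hence an F1 of 2/3.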

The metrics used to evaluate the accuracy of the energy expenditure estimation included the root mean squared error (RMSE), the mean absolute percentage error (MAPE), the coefficient of determination (R²), bias, and the 95% limits of agreement.
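These estimation-error metrics can be computed with a small helper (illustrative; the limits of agreement follow the usual bias ± 1.96 × SD of the errors):

```python
import math

def ee_error_metrics(ref, pred):
    """RMSE, MAPE, R^2, bias, and 95% limits of agreement between
    reference and predicted energy expenditure values."""
    n = len(ref)
    errors = [p - r for r, p in zip(ref, pred)]
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    mape = 100.0 * sum(abs(e) / r for e, r in zip(errors, ref)) / n
    mean_ref = sum(ref) / n
    r2 = 1.0 - (sum(e * e for e in errors)
                / sum((r - mean_ref) ** 2 for r in ref))
    bias = sum(errors) / n
    sd = math.sqrt(sum((e - bias) ** 2 for e in errors) / (n - 1))
    return {"rmse": rmse, "mape": mape, "r2": r2, "bias": bias,
            "loa": (bias - 1.96 * sd, bias + 1.96 * sd)}
```

A perfect prediction yields an RMSE and MAPE of zero, an R² of 1, and a bias of zero with collapsed limits of agreement.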

All data processing, model training and analysis were performed using Python v.3.11 [35]. The code used for analysis and the resulting models are available in the public Zenodo repository (https://doi.org/10.5281/zenodo.13477127).

Results

Activity classification

The activity classification model achieved an overall accuracy of 99.7% with F1 scores > 0.99 across all five activity types when applied to the test set (n = 23,040 activity samples). Sensitivity ranged from 0.99 for running to 1.00 for walking and sitting (Fig. 2).

Fig. 2

Confusion matrix with the ground truth activity classes and classes predicted by the CNN-BiLSTM classification model for the test set (n = 20 participants; 23,040 samples). Values presented are row percentages with the number of samples in brackets

Energy expenditure estimation

The simple linear regression model using the ENMO metric showed the highest estimation error with an RMSE of 0.023 kcal/kg/min and a MAPE of 41.5%. The stride and activity type-based algorithm (CATSE3) achieved the lowest RMSE of 0.013 kcal/kg/min and MAPE of 10.9% (Table 2). The R² values ranged from 0.828 for the simple ENMO model to 0.945 for the CATSE3 algorithm. The prediction of the CATSE3 model showed a negative bias close to zero (< 0.001 kcal/kg/min), whereas the predictions of all other models showed a negative bias of ≥ 0.005 kcal/kg/min (Fig. 3).

Across all walking conditions, the CATSE3 model achieved a MAPE of 6.6%, while the regression models including the activity type achieved MAPEs of 15.0% (ENMO + activity type) and 14.5% (MAD + activity type). MAPE values for running ranged from 7.9% (CATSE3 and MAD + activity type) to 10.5% (ENMO), while for cycling they ranged from 16.1% (CATSE3) to 29.6% (MAD + activity type). Post-hoc Pearson correlation analysis showed negative correlations between participants' height (r = -0.69) and weight (r = -0.47) and the prediction error in cycling.

Table 2 Error metrics for the energy expenditure estimation models. RMSE = root mean squared error; MAPE = mean absolute percentage error; R² = coefficient of determination; ENMO = Euclidean Norm Minus One; MAD = Mean Amplitude Deviation
Fig. 3

Scatterplots showing the relationship between the model energy expenditure predictions and the reference energy expenditure

Discussion

Our study aimed to develop and test a new approach for estimating energy expenditure based on raw 3D acceleration data from wearable accelerometers attached to the thigh. We successfully trained a deep learning algorithm on a pooled accelerometer dataset to classify basic activity types based on raw 3D acceleration data. By combining the activity classification algorithm with a stride-specific energy expenditure estimation approach, the CATSE3 model achieved a MAPE of 10.9% across different activities and intensities. In comparison, the activity-specific linear regression models showed a higher relative error (≥ 16.4%) and the simple linear regression models without the activity type showed more than twice the relative error (≥ 24.2%). These findings are consistent with results from previous research using a stride-specific model based on input data from two IMUs worn on the thigh and the shank [13]. The information contained in the raw 3D acceleration data of each stride cycle allows the CATSE3 model to distinguish between flat and inclined walking and running. This is probably due to small changes in the 3D acceleration curve during inclined gait. In comparison, the models based on the time-aggregated ENMO and MAD acceleration metrics do not appear to capture the increased energy expenditure associated with inclined walking and running (Fig. 4).

A more recent model based on wrist-worn IMU data combined an activity classification algorithm with walking speed estimation to predict energy expenditure during walking, sitting, and standing with an overall MAPE of 15% [11]. Previous research proposed a simple modelling equation based on the triaxial vector magnitude of a thigh-worn sensor to estimate energy expenditure during walking [22]. However, the model showed a MAPE of 18% over a range of walking speeds, whereas the CATSE3 model in our study showed a substantially lower MAPE of 6.6%. Importantly, the estimates of energy expenditure during cycling (MAPE = 16.1%) were less accurate than those for walking and running (6.6% and 7.9%, respectively). It appears that the model does not capture variations in energy expenditure during cycling very well (Fig. 3), which is consistent with previous research using hip-worn accelerometer data [12].

Fig. 4

Predicted energy expenditure for different participants as well as walking, running, and cycling conditions

Strengths and limitations

The activity protocol for this study followed recent guidelines for wearable energy expenditure validation studies [24]. The combination of multiple activity types, intensities, and flat and inclined treadmill activities allowed all models to be trained and evaluated over a wide range of activity conditions. However, additional activities such as stair climbing or downhill walking and running were not included in this study.

It is important to recognise that the CATSE3 model and the mapping of energy expenditure to the stride acceleration time series are based on data from treadmill walking and running as well as stationary cycling. However, there may be differences in the stride acceleration profiles of the thigh in free-living conditions, which may affect the transferability of our results. Previous research suggests that the biomechanics of walking and running on a treadmill are largely comparable to those on the ground, but there are minor differences in spatiotemporal (i.e. shorter stride length and higher cadence) and sagittal plane kinematics (increased hip and knee flexion angles) [36,37,38]. The impact of these differences on thigh acceleration curves is currently unknown, but warrants caution and further investigation.

Our study benefited from the integration of different datasets for model training and evaluation. We trained our activity classification model on an independent dataset that included data from several existing studies in different contexts, including laboratory and free-living activities. Therefore, we expect the classification model to be reasonably robust, as evidenced by the high overall classification accuracy both during model development (99.3% for the hold-out set) and on the test dataset (99.7%). However, the classification performance may not be indicative of actual performance in real-world conditions and needs to be further evaluated, ideally using an independent dataset with free-living activities and a more heterogeneous sample. In general, our sample for this study was homogeneous in terms of BMI, age, and health status. Therefore, the generalisability of our results to other populations, such as children, older adults, or people with gait disorders, may be limited.

Future investigations

It is currently unknown how the model performs for undefined activities and intensities outside the range included in the study (e.g. very fast walking). The inclusion of additional activity types (e.g. climbing stairs, lying in bed) and a wider range of walking and running speeds is likely to improve estimation accuracy. Future iterations of the model should therefore consider extending the range of activity types and account for unknown activity types.

Another way to improve the accuracy of the model may be to include body weight and height as additional input variables, as both are related to energy expenditure and the magnitude of the prediction error [22]. In addition, the estimation of energy expenditure during sitting and standing may be improved by integrating an estimate of the basal metabolic rate rather than including only an aggregated acceleration metric [13].

Conclusions

To our knowledge, the CATSE3 model is the first energy expenditure model that uses both activity classification and stride segmentation to estimate energy expenditure based solely on thigh acceleration data. With this study, we provide the physical activity research community with several open-source energy expenditure estimation algorithms for thigh-worn accelerometers of varying complexity.

Our results show that stride-specific information contained in the 3D acceleration signal can be used in combination with activity type information to achieve a lower estimation error compared with simpler estimation approaches currently used in physical activity research. Energy expenditure estimates based on activity and stride-specific information appear to be more accurate than approaches using aggregated acceleration metrics such as ENMO or MAD. The integration of available and validated activity classification algorithms should be encouraged in future research on energy expenditure estimation methods using thigh-worn accelerometry.

Data availability

The code and anonymised data necessary to reproduce the results of this study are available from the Zenodo repository (https://doi.org/10.5281/zenodo.13477127). The data includes raw data and custom Python code necessary to reproduce the results, tables, and figures of this study. The data used to train the activity classification algorithm can be requested from the respective authors.

Abbreviations

CATSE3:

Composite Activity Type and Stride-specific Energy Expenditure Estimation

CNN-BiLSTM:

Convolutional Neural Network - Bidirectional Long-Short Term Memory

ENMO:

Euclidean Norm Minus One

MAD:

Mean Amplitude Deviation

MAPE:

Mean Absolute Percentage Error

RMSE:

Root Mean Squared Error

TCN:

Temporal Convolutional Network

References

  1. Dunstan DW, Dogra S, Carter SE, Owen N. Sit less and move more for cardiovascular health: emerging insights and opportunities. Nat Rev Cardiol. 2021;18:637–48.

  2. Saunders TJ, McIsaac T, Douillette K, Gaulton N, Hunter S, Rhodes RE, et al. Sedentary behaviour and health in adults: an overview of systematic reviews. Appl Physiol Nutr Metab. 2020;45:S197–217.

  3. Stevens ML, Gupta N, Inan Eroglu E, Crowley PJ, Eroglu B, Bauman A, et al. Thigh-worn accelerometry for measuring movement and posture across the 24-hour cycle: a scoping review and expert statement. BMJ Open Sport Exerc Med. 2020;6:e000874.

  4. Hills AP, Mokhtar N, Byrne NM. Assessment of physical activity and energy expenditure: an overview of objective measures. Front Nutr. 2014;1.

  5. Varkevisser RDM, van Stralen MM, Kroeze W, Ket JCF, Steenhuis IHM. Determinants of weight loss maintenance: a systematic review. Obes Rev. 2019;20:171–211.

  6. Strain T, Dempsey PC, Wijndaele K, Sharp SJ, Kerrison N, Gonzales TI, et al. Quantifying the Relationship between Physical Activity Energy expenditure and incident type 2 diabetes: a prospective cohort study of device-measured activity in 90,096 adults. Diabetes Care. 2023;46:1145–55.

  7. van Doorn WPTM, Foreman YD, Schaper NC, Savelberg HHCM, Koster A, van der Kallen CJH, et al. Machine learning-based glucose prediction with use of continuous glucose and physical activity monitoring data: the Maastricht Study. PLoS ONE. 2021;16:e0253125.

  8. Düking P, Giessing L, Frenkel MO, Koehler K, Holmberg H-C, Sperlich B. Wrist-worn wearables for Monitoring Heart Rate and Energy Expenditure while sitting or performing light-to-vigorous physical activity: Validation Study. JMIR MHealth UHealth. 2020;8:e16716.

  9. Hajj-Boutros G, Landry-Duval M-A, Comtois AS, Gouspillou G, Karelis AD. Wrist-worn devices for the measurement of heart rate and energy expenditure: a validation study for the Apple Watch 6, Polar Vantage V and Fitbit sense. Eur J Sport Sci. 2022;1–13.

  10. Stenbäck V, Leppäluoto J, Leskelä N, Viitala L, Vihriälä E, Gagnon D, et al. Step detection and energy expenditure at different speeds by three accelerometers in a controlled environment. Sci Rep. 2021;11:20005.

  11. Aziz O, Zihajehzadeh S, Park A, Tae C-G, Park EJ. Improving Energy Expenditure Estimation through activity classification and walking speed estimation using a Smartwatch. 2020 42nd Annu Int conf IEEE Eng Med Biol soc EMBC. 2020. p. 3940–4.

  12. Garnotel M, Bastian T, Romero-Ugalde HM, Maire A, Dugas J, Zahariev A, et al. Prior automatic posture and activity identification improves physical activity energy expenditure prediction from hip-worn triaxial accelerometry. J Appl Physiol. 2018;124:780–90.

  13. Slade P, Kochenderfer MJ, Delp SL, Collins SH. Sensing leg movement enhances wearable monitoring of energy expenditure. Nat Commun. 2021;12:4312.

  14. Stamatakis E, Koster A, Hamer M, Rangul V, Lee I-M, Bauman AE, et al. Emerging collaborative research platforms for the next generation of physical activity, sleep and exercise medicine guidelines: the prospective physical activity, sitting, and Sleep consortium (ProPASS). Br J Sports Med. 2020;54:435–7.

  15. Crowley P, Skotte J, Stamatakis E, Hamer M, Aadahl M, Stevens ML, et al. Comparison of physical behavior estimates from three different thigh-worn accelerometers brands: a proof-of-concept for the prospective physical activity, sitting, and Sleep consortium (ProPASS). Int J Behav Nutr Phys Act. 2019;16:65.

  16. Bach K, Kongsvold A, Bårdstu H, Bardal EM, Kjærnli HS, Herland S, et al. A machine learning classifier for detection of physical activity types and postures during free-living. J Meas Phys Behav. 2022;5:24–31.

  17. Stewart T, Narayanan A, Hedayatrad L, Neville J, Mackay L, Duncan S. A dual-accelerometer system for classifying physical activity in children and adults. Med Sci Sports Exerc. 2018;50:2595–602.

  18. Logacjov A, Bach K. Self-supervised learning with randomized cross-sensor masked reconstruction for human activity recognition. Eng Appl Artif Intell. 2024;128:107478.

  19. Skovbjerg F, Honoré H, Mechlenburg I, Lipperts M, Gade R, Næss-Schmidt ET. Monitoring physical behavior in Rehabilitation using a machine learning–based algorithm for thigh-mounted accelerometers: Development and Validation Study. JMIR Bioinforma Biotechnol. 2022;3:e38512.

  20. Wullems JA, Verschueren SMP, Degens H, Morse CI, Onambélé GL. Performance of thigh-mounted triaxial accelerometer algorithms in objective quantification of sedentary behaviour and physical activity in older adults. PLoS ONE. 2017;12:e0188215.

  21. White T, Westgate K, Hollidge S, Venables M, Olivier P, Wareham N, et al. Estimating energy expenditure from wrist and thigh accelerometry in free-living adults: a doubly labelled water study. Int J Obes. 2019;43:2333–42.

  22. O’Brien MW, Wu Y, Johns JA, Poitras J, Kimmerly DS. Development and validation of an activPAL accelerometry count-based model of physical activity intensity in adults. Med Eng Phys. 2021;95:45–50.

    Article  PubMed  Google Scholar 

  23. Montoye AHK, Begum M, Henning Z, Pfeiffer KA. Comparison of linear and non-linear models for predicting energy expenditure from raw accelerometer data. Physiol Meas. 2017;38:343–57.

    Article  PubMed  Google Scholar 

  24. Argent R, Hetherington-Rauth M, Stang J, Tarp J, Ortega FB, Molina-Garcia P, et al. Recommendations for determining the validity of consumer wearables and smartphones for the Estimation of Energy Expenditure: Expert Statement and Checklist of the INTERLIVE Network. Sports Med. 2022;52:1817–32.

    Article  PubMed  PubMed Central  Google Scholar 

  25. de Weir JB. V. New methods for calculating metabolic rate with special reference to protein metabolism. J Physiol. 1949;109:1–9.

    Article  PubMed  PubMed Central  Google Scholar 

  26. Open Lab OMGUI. 2022. https://github.com/digitalinteraction/openmovement/wiki/AX3-GUI

  27. Chan S, Doherty A, Actipy. 2024. https://github.com/OxWearables/actipy

  28. Van Hees VT, Fang Z, Langford J, Assah F, Mohammad A, Da Silva ICM, et al. Autocalibration of accelerometer data for free-living physical activity assessment using local gravity and temperature: an evaluation on four continents. J Appl Physiol. 2014;117:738–44.

    Article  PubMed  PubMed Central  Google Scholar 

  29. Bellettiere J, Nakandala S, Tuz-Zahra F, Winkler EAH, Hibbing PR, Healy GN, et al. CHAP-Adult: A Reliable and Valid Algorithm to classify sitting and measure sitting patterns using data from hip-worn accelerometers in adults aged 35+. J Meas Phys Behav. 2022;5:215–23.

    Article  PubMed  PubMed Central  Google Scholar 

  30. Chollet F. others. Keras. 2015. Available from: https://keras.io.

  31. O’Malley T, Bursztein E, Long J, Chollet F, Jin H, Invernizzi L et al. KerasTuner. 2019. https://github.com/keras-team/keras-tuner

  32. Bai S, Kolter JZ, Koltun V. An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling. 2018 [Accessed 18 Jan 2024]; https://arxiv.org/abs/1803.01271

  33. Romijnders R, Warmerdam E, Hansen C, Schmidt G, Maetzler W. A Deep Learning Approach for Gait Event detection from a single Shank-worn IMU: validation in healthy and neurological cohorts. Sensors. 2022;22:3859.

    Article  PubMed  PubMed Central  Google Scholar 

  34. Rémy P. Temporal Convolutional Networks for Keras. GitHub Repos. GitHub; 2020. https://github.com/philipperemy/keras-tcn

  35. Python Software Foundation. Python. 2022. https://www.python.org/

  36. Strutzenberger G, Leutgeb L, Claußen L, Schwameder H. Gait on slopes: differences in temporo-spatial, kinematic and kinetic gait parameters between walking on a ramp and on a treadmill. Gait Posture. 2022;91:73–8.

    Article  PubMed  Google Scholar 

  37. Vickery-Howe DM, Bonanno DR, Dascombe BJ, Drain JR, Clarke AC, Hoolihan B, et al. Physiological, perceptual, and biomechanical differences between treadmill and overground walking in healthy adults: a systematic review and meta-analysis. J Sports Sci. 2023;41:2088–120.

    Article  CAS  PubMed  Google Scholar 

  38. Van Hooren B, Fuller JT, Buckley JD, Miller JR, Sewell K, Rao G, et al. Is Motorized Treadmill running biomechanically comparable to Overground running? A systematic review and Meta-analysis of Cross-over studies. Sports Med. 2020;50:785–813.

    Article  PubMed  Google Scholar 

Acknowledgements

The authors of this study would like to thank all the participants in this study for their time and commitment. The authors would also like to thank the participants and researchers involved in the studies that provided the additional datasets used in this study.

Funding

Open Access funding enabled and organized by Projekt DEAL. This project was funded by the Internal Research Funds of the German Sport University Cologne, grant agreement number L-11-10011-267-121010. Claas Lendt was also supported by a fellowship from the German Academic Exchange Service (DAAD). The funding bodies had no role in the design of the study, collection, analysis and interpretation of the data, or preparation of the manuscript.

Author information


Contributions

CL and NH collected the data. CL processed the raw data, developed the overall approach, and trained the different models on the given datasets. CL drafted the manuscript. NH, IF and TS provided critical feedback on the manuscript. All authors read and approved the final version of the manuscript.

Corresponding author

Correspondence to Claas Lendt.

Ethics declarations

Ethics approval and consent to participate

Each participant provided written informed consent prior to participation in the study, which was approved by the Ethics Committee of the German Sport University Cologne (reference number 142/2023).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Lendt, C., Hansen, N., Froböse, I. et al. Composite activity type and stride-specific energy expenditure estimation model for thigh-worn accelerometry. Int J Behav Nutr Phys Act 21, 99 (2024). https://doi.org/10.1186/s12966-024-01646-y
