1 German Center for Vertigo and Balance Disorders (DSGZ), LMU University Hospital, LMU Munich,
81377 Munich, Germany
2 Institut für Notfallmedizin und Medizinmanagement (INM), LMU University Hospital, LMU Munich,
80336 Munich, Germany
3 Schön Klinik Bad Aibling, 83043 Bad Aibling, Germany
4 Department of Neurology, LMU University Hospital, LMU Munich, 81377 Munich, Germany
* Correspondence: max.wuehr@med.uni-muenchen.de
Abstract: Instrumented gait analysis is widely used in clinical settings for the early detection of neurological disorders, monitoring disease progression, and evaluating fall risk.
However, the gold-standard marker-based 3D motion analysis is limited by high time and
personnel demands. Advances in computer vision now enable markerless whole-body
tracking with high accuracy. Here, we present vGait, a comprehensive 3D gait assessment
method using a single RGB-D sensor and state-of-the-art pose-tracking algorithms. vGait
was validated in healthy participants during frontal- and sagittal-perspective walking.
Performance was comparable across perspectives, with vGait achieving high accuracy in
detecting initial and final foot contacts (F1 scores > 95%) and reliably quantifying spatiotemporal gait parameters (e.g., stride time, stride length) and whole-body coordination metrics
(e.g., arm swing and knee angle ROM) at different levels of granularity (mean, step-to-step
variability, side asymmetry). The flexibility, accuracy, and minimal resource requirements of
vGait make it a valuable tool for clinical and non-clinical applications, including outpatient
clinics, medical practices, nursing homes, and community settings. By enabling efficient
and scalable gait assessment, vGait has the potential to enhance diagnostic and therapeutic
workflows and improve access to clinical mobility monitoring.
Keywords: gait analysis; gait disorders; motion tracking; pose tracking; RGB-D sensor
Figure 1. Experimental setup. (A) Participants walked along a marked figure-eight path with a diagonal length of 5.1 m, allowing for both frontal-perspective and sagittal-perspective walking. (B) A total of 17 displayed keypoints were analyzed to calculate spatiotemporal gait cycle parameters.
cubic spline interpolation. The subsequent analysis for step detection and gait parameter
evaluation was restricted to the sections where both systems could simultaneously observe
the walking participant, which corresponded to the two diagonal segments of the figure-
eight path. The analysis was conducted separately for gait segments in the frontal and
sagittal planes.
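This preprocessing can be carried out with standard scientific Python tools. The following is a minimal sketch, assuming per-keypoint 3D trajectories with occasional missing frames are gap-filled and resampled onto a uniform time base via cubic splines; the function name, array layout, and 30 Hz default are illustrative assumptions, not the actual vGait code.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def resample_keypoint(timestamps, keypoint_xyz, fs=30.0):
    """Gap-fill and resample one 3D keypoint trajectory via cubic spline interpolation.

    timestamps  : (N,) frame times in seconds (possibly irregular)
    keypoint_xyz: (N, 3) 3D positions in meters; NaN rows mark missing frames
    fs          : target sampling rate in Hz
    """
    valid = ~np.isnan(keypoint_xyz).any(axis=1)            # drop frames with missing data
    spline = CubicSpline(timestamps[valid], keypoint_xyz[valid], axis=0)
    t_uniform = np.arange(timestamps[valid][0], timestamps[valid][-1], 1.0 / fs)
    return t_uniform, spline(t_uniform)                     # uniformly sampled trajectory
```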
Step detection was performed using an established gait event detection algorithm
based on an adaptive threshold method utilizing foot keypoints [27]. Candidate initial
ground contact (IC) time instances were identified as those where the magnitude of the 3D
heel velocity vector became less than 0.5 times the walking speed. For the detection of final
ground contact (FC) time instances, an adaptive threshold of 0.8 times the walking speed
was applied to the 3D toe velocity. The detection algorithm was executed twice: initially
with an estimated walking speed of 1 m/s and subsequently with a refined walking speed
estimate based on the IC times and positions identified in the first iteration.
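To make the thresholding logic tangible, the sketch below illustrates an adaptive-threshold detector of the kind described above; it is a simplified reading of the referenced algorithm [27], not the published implementation. The signal names, the assumption that FC corresponds to the toe speed rising above its threshold, and the omission of the iterative walking-speed refinement are all our simplifications.

```python
import numpy as np

def detect_gait_events(heel_speed, toe_speed, walking_speed=1.0):
    """Candidate IC/FC frame indices from adaptive velocity thresholds (illustrative sketch).

    heel_speed, toe_speed: (N,) magnitudes of the 3D heel/toe velocity [m/s]
    walking_speed        : current walking-speed estimate [m/s]
    """
    ic_threshold = 0.5 * walking_speed
    fc_threshold = 0.8 * walking_speed
    # IC candidates: frames where the heel speed drops below its threshold
    below = heel_speed < ic_threshold
    ic_idx = np.flatnonzero(below[1:] & ~below[:-1]) + 1
    # FC candidates: frames where the toe speed rises above its threshold
    above = toe_speed > fc_threshold
    fc_idx = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return ic_idx, fc_idx
```

In the two-pass scheme described above, such a detector would first be run with walking_speed = 1.0 m/s and then rerun with a walking-speed estimate derived from the IC times and positions found in the first pass.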
Based on the results of the step detection, various gait cycle parameters were calculated
to characterize the spatiotemporal step sequence and different additional aspects of whole-
body coordination during walking (Figure 2). The spatiotemporal gait cycle parameters
included stride time (temporal difference between successive ICs of the same foot, [s]),
swing phase (duration of the gait cycle during which only one foot is in contact with the
ground, [s]), and double support phase (duration of the gait cycle during which both feet
are in contact with the ground, [s]). Additionally, we calculated stride length (Euclidean distance between the foot positions at successive ICs of the same foot, [m]) and stride width (perpendicular distance of one IC foot position to the line connecting two successive IC positions of the opposite foot, [m]). Furthermore, we calculated the foot progression angle (FPA, angle between the line of progression, i.e., the line connecting two successive IC positions of the same foot, and the longitudinal axis of the foot, defined by the line connecting the heel and toe positions during the stance phase, [°]). The arm swing range of motion (ROM) was defined as the maximal angular displacement of the line connecting the shoulder and wrist in the walking direction during the gait cycle [°]. The knee ROM was determined by the angular difference between the maximum extension and flexion of the knee during the gait cycle [°]. For all gait parameters, we calculated the mean over all collected gait cycles. Furthermore, stride-to-stride fluctuations in each parameter were assessed by calculating the coefficient of variation (CV; 100 × std/mean, [%]), and side asymmetry was computed as 100 × (1 − mean(smaller foot value)/mean(larger foot value)) [%].
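To make these definitions concrete, the following minimal sketch shows how stride time, stride length, CV, and side asymmetry could be derived from detected IC events; the helper names and array layout are illustrative assumptions rather than the actual vGait implementation.

```python
import numpy as np

def stride_parameters(ic_times, ic_positions):
    """Per-foot stride times [s] and stride lengths [m] from successive ICs of the same foot.

    ic_times    : (N,) IC time stamps of one foot in seconds
    ic_positions: (N, 3) 3D heel positions at those ICs in meters
    """
    stride_times = np.diff(ic_times)                                         # time between successive ICs
    stride_lengths = np.linalg.norm(np.diff(ic_positions, axis=0), axis=1)   # Euclidean distance between ICs
    return stride_times, stride_lengths

def coefficient_of_variation(values):
    """CV = 100 * std / mean [%], quantifying stride-to-stride fluctuations."""
    return 100.0 * np.std(values) / np.mean(values)

def side_asymmetry(left_values, right_values):
    """Asymmetry = 100 * (1 - mean(smaller side) / mean(larger side)) [%]."""
    smaller, larger = sorted([np.mean(left_values), np.mean(right_values)])
    return 100.0 * (1.0 - smaller / larger)
```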
Figure 2. Definition of spatial gait characteristics. (A) Stride length is the distance between two successive heel contacts of the same foot, while stride width is the perpendicular distance from one heel contact to the line connecting two successive heel contacts of the opposite foot (i.e., the line of progression). The FPA is the angular deviation between the foot midline and the line of progression. (B) Arm swing ROM is the maximal angular displacement of the line connecting the shoulder and wrist in the walking direction within a gait cycle. (C) Knee ROM is defined as the angular difference between the maximum extension and flexion of the knee during the gait cycle. Exemplary knee joint angle curves (mean ± SD) are shown from vGait (red line) and the ground truth (gray line). Abbreviations: FPA, foot progression angle; ROM, range of motion.
$$\text{precision} = \frac{TP}{TP + FP}$$

$$\text{F1 score} = 2 \times \frac{\text{precision} \times \text{recall}}{\text{precision} + \text{recall}}$$
A detected event (either IC or FC) was considered a TP if the absolute time difference from the corresponding gold-standard event was <250 ms [28]. For all TPs, the time agreement with the ground truth was quantified by the temporal error $= \left| t_{\text{gold standard}} - t_{\text{vGait}} \right|$.
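The matching and scoring procedure can be expressed compactly in Python. The sketch below pairs each gold-standard event with the closest detected event within the 250 ms tolerance and computes precision, recall (TP/(TP + FN)), and F1; the one-to-one greedy matching is our simplifying assumption, not a detail stated in the text.

```python
import numpy as np

def score_events(gold_times, detected_times, tol=0.250):
    """Precision, recall, F1, and temporal errors for detected gait events vs. gold standard."""
    gold_times = np.sort(np.asarray(gold_times, float))
    remaining = list(np.sort(np.asarray(detected_times, float)))
    tp, temporal_errors = 0, []
    for t_gold in gold_times:
        if not remaining:
            break
        errors = np.abs(np.array(remaining) - t_gold)
        i = int(np.argmin(errors))
        if errors[i] < tol:                       # true positive within the 250 ms tolerance
            tp += 1
            temporal_errors.append(errors[i])
            remaining.pop(i)                      # each detection is matched at most once
    fp = len(detected_times) - tp                 # unmatched detections
    fn = len(gold_times) - tp                     # missed gold-standard events
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1, np.array(temporal_errors)
```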
We employed multiple statistical techniques to assess the agreement of derived tem-
poral and spatial gait cycle parameters with the gold standard, including the absolute and
relative root mean square error (RMSE) and the intraclass correlation coefficient for relative
agreement (ICC(3,1); two-way mixed model). ICC outcomes were interpreted according to
established categories [29]: poor agreement (<0.5), moderate agreement (0.5–0.75), good
agreement (0.75–0.9), and excellent agreement (>0.9). All analyses were conducted using
Python 3.9.
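For transparency, the agreement metrics can be computed as in the following minimal sketch of the absolute/relative RMSE and a textbook ICC(3,1) (two-way mixed model, single measurement, consistency) applied to paired per-participant values from the two systems; this illustrates the definitions rather than reproducing the exact analysis script. In practice, a library implementation such as pingouin's intraclass_corr could be used instead.

```python
import numpy as np

def rmse_abs_rel(gold, test):
    """Absolute RMSE and RMSE relative to the mean of the gold standard [%]."""
    gold, test = np.asarray(gold, float), np.asarray(test, float)
    rmse = np.sqrt(np.mean((gold - test) ** 2))
    return rmse, 100.0 * rmse / np.mean(gold)

def icc_3_1(gold, test):
    """ICC(3,1): two-way mixed-effects ANOVA, single rater, consistency (cf. Koo & Li, 2016)."""
    data = np.column_stack([gold, test]).astype(float)     # n subjects (rows) x k = 2 methods (cols)
    n, k = data.shape
    grand = data.mean()
    ss_rows = k * np.sum((data.mean(axis=1) - grand) ** 2)   # between-subjects sum of squares
    ss_cols = n * np.sum((data.mean(axis=0) - grand) ** 2)   # between-methods sum of squares
    ss_err = np.sum((data - grand) ** 2) - ss_rows - ss_cols # residual sum of squares
    ms_r = ss_rows / (n - 1)
    ms_e = ss_err / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e)
```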
3. Results
3.1. Step Detection Performance
Table 1 and Figure 3 summarize the overall performance of vGait in detecting ICs and
FCs during frontal- and sagittal-perspective walking. Both event types were identified
with high accuracy, achieving F1 scores exceeding 95% for walking captured from both
perspectives. Overall, vGait tended to detect both types of events slightly earlier, with
absolute time errors ranging from 45 to 65 ms.
Table 1. Detection performance and temporal agreement of initial and final foot contacts identified
by vGait compared to the gold standard.
Figure 3. Histograms illustrating the temporal agreement ($t_{\text{gold standard}} - t_{\text{vGait}}$) of initial and final foot contacts identified by vGait compared to the gold standard during (A) frontal-perspective walking and (B) sagittal-perspective walking.
Table 2. Accuracy statistics of gait cycle parameters derived from vGait during frontal-perspective walking, compared to the gold standard.
Param.            Metric   Gold Standard    vGait            RMSE_ABS   RMSE_REL   ICC(3,1)
stride time       mean     1.1 ± 0.1 s      1.1 ± 0.1 s      0.1 s      4.6%       0.952
stride time       CV       3.9 ± 0.6%       6.3 ± 3.1%       3.9%       61.3%      0.812
stride time       asym.    1.6 ± 1.0%       5.3 ± 5.3%       6.4%       120.8%     0.803
swing time        mean     0.4 ± 0.0 s      0.5 ± 0.0 s      0.1 s      74.4%      0.784
swing time        CV       9.2 ± 2.0%       2.4 ± 1.5%       7.3%       75.5%      0.579
swing time        asym.    5.7 ± 4.5%       2.1 ± 2.1%       6.4%       110.0%     0.442
dsupp time        mean     0.2 ± 0.0 s      0.1 ± 0.1 s      0.1 s      74.4%      0.769
dsupp time        CV       21.4 ± 4.9%      52.7 ± 23.6%     39.7%      75.5%      0.784
dsupp time        asym.    8.5 ± 7.8%       30.9 ± 24.1%     34.0%      110.0%     0.778
stride length     mean     1.4 ± 0.1 m      1.4 ± 0.1 m      0.0 m      3.0%       0.986
stride length     CV       3.4 ± 0.5%       5.1 ± 2.0%       2.9%       56.4%      0.738
stride length     asym.    1.5 ± 0.8%       3.5 ± 3.5%       4.2%       188.5%     0.783
base of support   mean     0.2 ± 0.0 m      0.2 ± 0.0 m      0.0 m      11.5%      0.896
base of support   CV       27.7 ± 9.9%      26.2 ± 8.3%      5.5%       53.5%      0.850
base of support   asym.    10.5 ± 8.4%      13.7 ± 9.7%      5.8%       2.6%       0.859
velocity          mean     1.2 ± 0.1 m/s    1.2 ± 0.1 m/s    0.0 m/s    2.9%       0.991
velocity          CV       5.3 ± 0.6%       3.5 ± 0.9%       2.0%       58.6%      0.787
velocity          asym.    2.1 ± 1.9%       1.9 ± 2.1%       2.2%       112.8%     0.795
FPA               mean     7.5 ± 1.5°       5.9 ± 1.6°       2.0°       33.4%      0.912
FPA               CV       25.8 ± 6.9%      28.2 ± 8.8%      9.1%       33.4%      0.805
FPA               asym.    12.5 ± 9.0%      20.2 ± 15.5%     19.2%      95.5%      0.754
arm swing ROM     mean     31.1 ± 8.8°      25.2 ± 9.7°      7.5°       29.8%      0.949
arm swing ROM     CV       29.6 ± 14.7%     17.5 ± 5.8%      17.9%      102.3%     0.522
arm swing ROM     asym.    24.8 ± 13.6%     16.2 ± 13.2%     21.3%      131.4%     0.655
knee angle ROM    mean     38.2 ± 4.3°      40.3 ± 4.4°      4.8°       11.9%      0.817
knee angle ROM    CV       10.9 ± 3.5%      6.5 ± 5.0%       5.5%       102.0%     0.806
knee angle ROM    asym.    5.6 ± 4.1%       7.8 ± 6.6%       5.8%       107.1%     0.730
Abbreviations: dsupp time: double support time; FPA: foot progression angle; ROM: range of motion; CV: coefficient of variation; asym.: asymmetry; RMSE_ABS/RMSE_REL: absolute and relative root mean square error; ICC: intraclass correlation coefficient.
Table 3. Accuracy statistics of gait cycle parameters derived from vGait during sagittal-perspective
walking, compared to the gold standard.
Param.            Metric   Gold Standard    vGait            RMSE_ABS   RMSE_REL   ICC(3,1)
stride time       mean     1.1 ± 0.1 s      1.1 ± 0.1 s      0.0 s      3.75%      0.975
stride time       CV       4.4 ± 1.5%       1.5 ± 1.2%       3.4%       222.2%     0.638
stride time       asym.    5.0 ± 3.4%       0.7 ± 0.9%       5.5%       758.1%     0.335
swing time        mean     0.4 ± 0.0 s      0.5 ± 0.0 s      0.1 s      13.6%      0.663
swing time        CV       10.2 ± 3.4%      2.9 ± 3.0%       8.4%       292.8%     0.689
swing time        asym.    8.7 ± 6.1%       2.6 ± 2.5%       9.6%       368.8%     0.365
dsupp time        mean     0.2 ± 0.1 s      0.1 ± 0.0 s      0.1 s      96.9%      0.741
dsupp time        CV       30.7 ± 8.5%      18.9 ± 16.7%     20.9%      110.5%     0.790
dsupp time        asym.    16.5 ± 8.1%      7.6 ± 6.7%       13.0%      172.0%     0.684
stride length     mean     1.4 ± 0.1 m      1.4 ± 0.1 m      0.0 m      2.1%       0.996
stride length     CV       3.0 ± 1.1%       1.9 ± 0.8%       1.5%       77.3%      0.743
stride length     asym.    2.1 ± 1.4%       0.7 ± 0.7%       2.2%       334.6%     0.473
base of support   mean     0.2 ± 0.1 m      0.2 ± 0.1 m      0.0 m      23.7%      0.959
base of support   CV       28.7 ± 14.0%     28.9 ± 18.5%     12.0%      41.4%      0.908
base of support   asym.    17.2 ± 10.2%     13.5 ± 9.2%      13.9%      102.5%     0.665
velocity          mean     1.3 ± 0.1 m/s    1.3 ± 0.1 m/s    0.0 m/s    2.9%       0.988
velocity          CV       5.3 ± 2.2%       2.5 ± 1.6%       3.6%       144.8%     0.695
velocity          asym.    4.1 ± 3.5%       0.6 ± 1.0%       4.6%       718.2%     0.451
FPA               mean     5.6 ± 2.1°       5.4 ± 2.1°       0.6°       10.4%      0.985
FPA               CV       30.8 ± 14.0%     27.6 ± 14.7%     9.9%       35.8%      0.913
FPA               asym.    34.2 ± 11.0%     19.5 ± 14.7%     15.2%      78.0%      0.810
arm swing ROM     mean     34.2 ± 11.0°     25.3 ± 9.5°      12.4°      49.3%      0.838
arm swing ROM     CV       34.0 ± 8.3%      14.4 ± 8.5%      22.4%      156.3%     0.716
arm swing ROM     asym.    36.2 ± 16.7%     17.9 ± 12.1%     29.3%      163.9%     0.545
knee angle ROM    mean     49.1 ± 6.9°      38.7 ± 5.1°      11.9°      30.8%      0.765
knee angle ROM    CV       19.6 ± 9.6%      5.7 ± 5.8%       18.8%      330.0%     0.479
knee angle ROM    asym.    18.2 ± 9.5%      7.2 ± 5.4%       16.6%      232.5%     0.459
Abbreviations: dsupp time: double support time; FPA: foot progression angle; ROM: range of motion; CV: coefficient of variation; asym.: asymmetry; RMSE_ABS/RMSE_REL: absolute and relative root mean square error; ICC: intraclass correlation coefficient.
4. Discussion
In this study, we investigated the reliability of a clinical gait analysis approach using
a single integrated RGB-D sensor (vGait). Our findings demonstrate that this method
enables reliable temporal step detection, as well as the determination of spatiotemporal
gait parameters across various levels of granularity (i.e., mean values, variability, and
side asymmetry) and clinically relevant aspects of whole-body gait coordination, with
overall good-to-excellent reliability. This holds true when analyzing gait in a frontal
perspective (i.e., walking in the direction of the camera) and, with only minor compromises,
from a sagittal perspective (i.e., walking sideways across the camera’s field of view), thus
underscoring the spatial flexibility of the vGait approach.
It is now common practice to categorize neurological gait disorders based on their
phenotypic presentations (e.g., ataxic, hypokinetic, dyskinetic) [1–3], and optical solutions
are ideally suited to instrumentally characterize these conditions. Although conventional
marker-based motion capture systems have traditionally represented the gold standard,
they remain time- and resource-intensive. Recent advances in computer vision and optical
hardware have enabled more comprehensive “deep phenotyping” of gait disorders without
markers or complex multi-camera setups. Indeed, a growing body of research now shows
that state-of-the-art pose-tracking solutions can achieve accuracy levels approaching those
of marker-based systems [13–15]. However, achieving such accuracy in 3D gait analysis
has often required multi-camera RGB configurations with extensive calibration and synchronization, limiting their practical utility. Previous efforts to enhance spatial flexibility through integrated RGB-D sensors were hampered by insufficient depth coverage, excessive noise, and tracking errors, necessitating multiple sensors to reliably cover clinically meaningful distances [16–23]. With the latest generation of RGB-D sensors offering extended depth coverage and improved depth mapping, our current study demonstrates that
these constraints have now been overcome, enabling clinically reliable gait phenotyping
using a single, integrated, and spatially versatile sensor system.
Building on these advances, our study demonstrates that clinically reliable gait phenotyping is now achievable over meaningful distances using just a single RGB-D sensor (Kinect Azure). In the sagittal plane, we successfully characterized gait across ~5 m, capturing approximately eight steps or seven complete gait cycles, assuming an average step
length of ~0.6 m. From a frontal perspective, we achieved similarly robust results over
~4.6 m, amounting to about seven steps or six full cycles. This spatial flexibility extends the
applicability of vGait beyond specialized laboratory setups, making it readily deployable
in a variety of clinical and non-clinical environments, such as outpatient clinics, medical
practices, nursing homes, and community centers. Moreover, our approach is not confined
to gait assessment alone; it can be seamlessly adapted to evaluate other clinically relevant parameters, including static and dynamic postural stability [24,30], thereby enriching
clinical evaluations with objective, digital outcome metrics.
Clinical gait assessment commonly focuses on five major domains [31,32]: (1) pace
(e.g., gait speed, stride length), (2) rhythm (e.g., swing and double support phases), (3) variability (e.g., stride time and stride length variability), (4) asymmetry (e.g., asymmetry of
stride time and stride length), and (5) postural control (e.g., average and variability of
stride width). Our results show that vGait can reliably quantify spatiotemporal parameters
across all these domains, underscoring its potential for the comprehensive and accurate
identification of gait disturbances. This level of precision enables clinicians to monitor
disease progression and therapeutic responses with clinically meaningful accuracy. For
example, the minimal clinically important difference (MCID) for gait speed in adults with
various health conditions, such as multiple sclerosis, acute cardiovascular disease, and
stroke, typically ranges from 10 to 20 cm/s [33,34]. This range is well above vGait’s error
in estimating gait velocity (RMSEABS of about 4 cm/s). Beyond gait speed, changes in
gait variability offer critical insights into fall risk and disease progression in conditions
such as cerebellar gait ataxia and Parkinson’s disease [35,36]. Recent estimates suggest an
MCID of 1.0% for spatial gait variability in patients with Parkinson’s disease [37], which
aligns closely with vGait’s error in estimating stride length variability (RMSEABS = 1.5%).
Similarly, the MCID for gait asymmetry—a key metric for assessing rehabilitation outcomes
in stroke patients—has been estimated to range between 10 and 20% [38]. vGait meets
these precision requirements with its stride time asymmetry error (RMSEABS of about 6%)
and stride length asymmetry error (RMSEABS of about 2–3%).
Some limitations of vGait should be acknowledged. Thus far, we have only validated
the approach in healthy participants. Nevertheless, since the underlying pose-tracking
method identifies keypoints frame-by-frame and does not rely on a specific movement profile, we anticipate that its reliability will extend to pathological gait patterns [12]. Another
limiting factor is the low temporal sampling rate of the RGB-D sensor used in this study
(Kinect Azure), which operates at 30 Hz. Combined with the inherent temporal imprecision
of the step detection algorithm (for both the gold standard and vGait) of >20 ms [27],
these factors likely contribute significantly to the observed temporal variability and side
asymmetry. This may explain why these aspects did not achieve excellent agreement with
the gold standard. Future studies should aim to use RGB-D sensors with higher temporal
resolution when possible. Finally, we validated vGait using a particular RGB-D technology
(time-of-flight via Kinect Azure), so future studies should investigate whether other RGB-D
systems—such as those integrated into smartphones or tablets, employing different depth-
mapping technologies (e.g., stereo vision, structured light, LiDAR)—perform similarly.
Such findings could pave the way for broadly accessible, cost-effective, and mobile 3D
gait-analysis solutions.
5. Conclusions
In summary, we have demonstrated that clinically reliable gait analysis can be achieved
using a single integrated RGB-D sensor (vGait), providing good-to-excellent agreement
with gold-standard marker-based methods across a broad range of spatiotemporal gait
parameters. Moreover, the approach can be flexibly applied from different perspectives.
Future research should validate vGait in different patient populations, including individuals with hypokinetic or ataxic gait disorders, and explore its applicability to alternative
RGB-D sensor technologies without significant loss in quality.
Author Contributions: Conceptualization, R.S. and M.W.; methodology, J.D. and M.W.; software,
J.D. and M.W.; validation, L.B., J.B. and M.W.; formal analysis, L.B., J.D. and M.W.; investigation,
L.B., J.B., R.S., J.D. and M.W.; resources, R.S., J.D. and M.W.; data curation, L.B., J.B. and M.W.;
writing—original draft preparation, L.B. and M.W.; writing—review and editing, J.B., R.S. and J.D.;
visualization, M.W.; supervision, M.W.; project administration, M.W.; funding acquisition, M.W. All
authors have read and agreed to the published version of the manuscript.
Funding: This research was funded by the German Federal Ministry for Education and Science, grant numbers 01EO1401 and 13GW0490B.
Institutional Review Board Statement: The study protocol was approved by the ethics committee
of the medical faculty of the University of Munich (LMU, 34-16), and the study was conducted in
conformity with the Declaration of Helsinki.
Informed Consent Statement: Informed consent was obtained from all subjects involved in the study.
Data Availability Statement: Sample datasets and gait analysis scripts used in this study are publicly
available at https://github.com/DSGZ-MotionLab/vGait, accessed on 19 December 2024. The
complete data from this study can be obtained upon reasonable request from M.W. The participants
did not consent to the publication of their sensor data in open repositories in accordance with
European data protection laws.
Acknowledgments: The authors sincerely thank Karen Otte for her valuable support in data collection.
References
1. Jahn, K.; Zwergal, A.; Schniepp, R. Gait disturbances in old age: Classification, diagnosis, and treatment from a neurological
perspective. Dtsch. Arztebl. Int. 2010, 107, 306–316. [PubMed]
2. Snijders, A.H.; van de Warrenburg, B.P.; Giladi, N.; Bloem, B.R. Neurological gait disorders in elderly people: Clinical approach
and classification. Lancet Neurol. 2007, 6, 63–74. [CrossRef] [PubMed]
3. Nonnekes, J.; Goselink, R.J.M.; Ruzicka, E.; Fasano, A.; Nutt, J.G.; Bloem, B.R. Neurological disorders of gait, balance and posture:
A sign-based approach. Nat. Rev. Neurol. 2018, 14, 183–189. [CrossRef]
4. Goetz, C.G.; Tilley, B.C.; Shaftman, S.R.; Stebbins, G.T.; Fahn, S.; Martinez-Martin, P.; Poewe, W.; Sampaio, C.; Stern, M.B.; Dodel,
R.; et al. Movement Disorder Society-sponsored revision of the Unified Parkinson’s Disease Rating Scale (MDS-UPDRS): Scale
presentation and clinimetric testing results. Mov. Disord. 2008, 23, 2129–2170. [CrossRef]
5. Kurtzke, J.F. Rating neurologic impairment in multiple sclerosis. Neurology 1983, 33, 1444. [CrossRef]
6. Schmitz-Hübsch, T.; du Montcel, S.T.; Baliko, L.; Berciano, J.; Boesch, S.; Depondt, C.; Giunti, P.; Globas, C.; Infante, J.; Kang, J.S.;
et al. Scale for the assessment and rating of ataxia: Development of a new clinical scale. Neurology 2006, 66, 1717–1720. [CrossRef]
7. Heldman, D.A.; Espay, A.J.; LeWitt, P.A.; Giuffrida, J.P. Clinician versus machine: Reliability and responsiveness of motor
endpoints in Parkinson’s disease. Park. Relat. Disord. 2014, 20, 590–595. [CrossRef]
8. Krebs, D.E.; Edelstein, J.E.; Fishman, S. Reliability of observational kinematic gait analysis. Phys. Ther. 1985, 65, 1027–1033.
[CrossRef]
9. Saleh, M.; Murdoch, G. In defence of gait analysis. Observation and measurement in gait assessment. J. Bone Jt. Surg. Br. 1985, 67,
237–241. [CrossRef]
10. Ilg, W.; Golla, H.; Thier, P.; Giese, M.A. Specific influences of cerebellar dysfunctions on gait. Brain 2007, 130 Pt 3, 786–798.
[CrossRef]
11. Raccagni, C.; Nonnekes, J.; Bloem, B.R.; Peball, M.; Boehme, C.; Seppi, K.; Wenning, G.K. Gait and postural disorders in
parkinsonism: A clinical approach. J. Neurol. 2020, 267, 3169–3176. [CrossRef] [PubMed]
12. Mathis, A.; Schneider, S.; Lauer, J.; Mathis, M.W. A Primer on Motion Capture with Deep Learning: Principles, Pitfalls, and
Perspectives. Neuron 2020, 108, 44–65. [CrossRef]
13. Uhlrich, S.D.; Falisse, A.; Kidzinski, L.; Muccini, J.; Ko, M.; Chaudhari, A.S.; Hicks, J.L.; Delp, S.L. OpenCap: Human movement
dynamics from smartphone videos. PLoS Comput. Biol. 2023, 19, e1011462. [CrossRef]
14. Moro, M.; Marchesi, G.; Hesse, F.; Odone, F.; Casadio, M. Markerless vs. Marker-Based Gait Analysis: A Proof of Concept Study.
Sensors 2022, 22, 2011. [CrossRef]
15. Stenum, J.; Hsu, M.M.; Pantelyat, A.Y.; Roemmich, R.T. Clinical gait analysis using video-based pose estimation: Multiple
perspectives, clinical populations, and measuring change. PLoS Digit. Health 2024, 3, e0000467. [CrossRef]
16. Springer, S.; Yogev Seligmann, G. Validity of the Kinect for Gait Assessment: A Focused Review. Sensors 2016, 16, 194. [CrossRef]
17. Steinert, A.; Sattler, I.; Otte, K.; Rohling, H.; Mansow-Model, S.; Muller-Werdan, U. Using New Camera-Based Technologies for
Gait Analysis in Older Adults in Comparison to the Established GAITRite System. Sensors 2019, 20, 125. [CrossRef]
18. Clark, R.A.; Bower, K.J.; Mentiplay, B.F.; Paterson, K.; Pua, Y.H. Concurrent validity of the Microsoft Kinect for assessment of
spatiotemporal gait variables. J. Biomech. 2013, 46, 2722–2725. [CrossRef]
19. Pfister, A.; West, A.M.; Bronner, S.; Noah, J.A. Comparative abilities of Microsoft Kinect and Vicon 3D motion capture for gait
analysis. J. Med. Eng. Technol. 2014, 38, 274–280. [CrossRef]
20. Xu, X.; McGorry, R.W.; Chou, L.S.; Lin, J.H.; Chang, C.C. Accuracy of the Microsoft Kinect for measuring gait parameters during
treadmill walking. Gait Posture 2015, 42, 145–151. [CrossRef]
21. Müller, B.; Ilg, W.; Giese, M.A.; Ludolph, N. Validation of enhanced kinect sensor based motion capturing for gait assessment.
PLoS ONE 2017, 12, e0175813. [CrossRef] [PubMed]
22. Geerse, D.J.; Coolen, B.H.; Roerdink, M. Kinematic Validation of a Multi-Kinect v2 Instrumented 10-Meter Walkway for
Quantitative Gait Assessments. PLoS ONE 2015, 10, e0139913. [CrossRef] [PubMed]
23. Hazra, S.; Pratap, A.A.; Tripathy, D.; Nandy, A. Novel data fusion strategy for human gait analysis using multiple kinect sensors.
Biomed. Signal Process. Control 2021, 67, 102512. [CrossRef]
24. Bertram, J.; Kruger, T.; Rohling, H.M.; Jelusic, A.; Mansow-Model, S.; Schniepp, R.; Wuehr, M.; Otte, K. Accuracy and repeatability
of the Microsoft Azure Kinect for clinical measurement of motor function. PLoS ONE 2023, 18, e0279697. [CrossRef]
25. Jocher, G.; Chaurasia, A.; Qiu, J. Ultralytics YOLO; Ultralytics: Frederick, MD, USA, 2023; Volume 8.0.0.
26. Jiang, T.; Lu, P.; Zhang, L.; Ma, N.; Han, R.; Lyu, C.; Li, Y.; Chen, K. Rtmpose: Real-time multi-person pose estimation based on
mmpose. arXiv 2023, arXiv:2303.07399.
27. Bonci, T.; Salis, F.; Scott, K.; Alcock, L.; Becker, C.; Bertuletti, S.; Buckley, E.; Caruso, M.; Cereatti, A.; Del Din, S.; et al. An
Algorithm for Accurate Marker-Based Gait Event Detection in Healthy and Pathological Populations During Complex Motor
Tasks. Front. Bioeng. Biotechnol. 2022, 10, 868928. [CrossRef]
28. Romijnders, R.; Warmerdam, E.; Hansen, C.; Schmidt, G.; Maetzler, W. A Deep Learning Approach for Gait Event Detection from
a Single Shank-Worn IMU: Validation in Healthy and Neurological Cohorts. Sensors 2022, 22, 3859. [CrossRef]
29. Koo, T.K.; Li, M.Y. A Guideline of Selecting and Reporting Intraclass Correlation Coefficients for Reliability Research. J. Chiropr.
Med. 2016, 15, 155–163. [CrossRef]
30. Ellrich, N.; Niermeyer, K.; Peto, D.; Decker, J.; Fietzek, U.M.; Katzdobler, S.; Höglinger, G.U.; Jahn, K.; Zwergal, A.; Wuehr, M.
Precision Balance Assessment in Parkinson’s Disease: Utilizing Vision-Based 3D Pose Tracking for Pull Test Analysis. Sensors
2024, 24, 3673. [CrossRef]
31. Lord, S.; Galna, B.; Verghese, J.; Coleman, S.; Burn, D.; Rochester, L. Independent domains of gait in older adults and associated
motor and nonmotor attributes: Validation of a factor analysis approach. J. Gerontol. A Biol. Sci. Med. Sci. 2013, 68, 820–827.
[CrossRef]
32. Lord, S.; Galna, B.; Rochester, L. Moving forward on gait measurement: Toward a more refined approach. Mov. Disord. 2013, 28,
1534–1543. [CrossRef] [PubMed]
33. Gardner, A.W.; Montgomery, P.S.; Wang, M. Minimal clinically important differences in treadmill, 6-minute walk, and patient-
based outcomes following supervised and home-based exercise in peripheral artery disease. Vasc. Med. 2018, 23, 349–357.
[CrossRef] [PubMed]
34. Bohannon, R.W.; Glenney, S.S. Minimal clinically important difference for change in comfortable gait speed of adults with
pathology: A systematic review. J. Eval. Clin. Pract. 2014, 20, 295–300. [CrossRef] [PubMed]
35. Schniepp, R.; Mohwald, K.; Wuehr, M. Gait ataxia in humans: Vestibular and cerebellar control of dynamic stability. J. Neurol.
2017, 264 (Suppl. S1), 87–92. [CrossRef]
36. Hausdorff, J.M. Gait variability: Methods, modeling and meaning. J. Neuroeng. Rehabil. 2005, 2, 19. [CrossRef]
37. Baudendistel, S.T.; Haussler, A.M.; Rawson, K.S.; Earhart, G.M. Minimal clinically important differences of spatiotemporal gait
variables in Parkinson disease. Gait Posture 2024, 108, 257–263. [CrossRef]
38. Lewek, M.D.; Randall, E.P. Reliability of spatiotemporal asymmetry during overground walking for individuals following chronic
stroke. J. Neurol. Phys. Ther. 2011, 35, 116–121. [CrossRef]