Review

Wearable-Sensor-Based Detection and Prediction of Freezing of Gait in Parkinson’s Disease: A Review

1 Department of Systems Design Engineering, University of Waterloo, Waterloo, ON N2L 3G1, Canada
2 School of Human Kinetics, Faculty of Health Sciences, University of Ottawa, Ottawa, ON K1N 6N5, Canada
3 Faculty of Medicine, University of Ottawa, Ottawa Hospital Research Institute, Ottawa, ON K1H 8M2, Canada
* Author to whom correspondence should be addressed.
Sensors 2019, 19(23), 5141; https://doi.org/10.3390/s19235141
Submission received: 31 October 2019 / Revised: 19 November 2019 / Accepted: 20 November 2019 / Published: 24 November 2019
(This article belongs to the Special Issue Wearable Sensors in Healthcare: Methods, Algorithms, Applications)

Abstract

Freezing of gait (FOG) is a serious gait disturbance, common in mid- and late-stage Parkinson’s disease, that affects mobility and increases fall risk. Wearable sensors have been used to detect and predict FOG with the ultimate aim of preventing freezes or reducing their effect using gait monitoring and assistive devices. This review presents and assesses the state of the art of FOG detection and prediction using wearable sensors, with the intention of providing guidance on current knowledge, and identifying knowledge gaps that need to be filled and challenges to be considered in future studies. This review searched the Scopus, PubMed, and Web of Science databases to identify studies that used wearable sensors to detect or predict FOG episodes in Parkinson’s disease. Following screening, 74 publications were included, comprising 68 publications detecting FOG, seven predicting FOG, and one in both categories. Details were extracted regarding participants, walking task, sensor type and body location, detection or prediction approach, feature extraction and selection, classification method, and detection and prediction performance. The results showed that increasingly complex machine-learning algorithms combined with diverse feature sets improved FOG detection. The lack of large FOG datasets and highly person-specific FOG manifestation were common challenges. Transfer learning and semi-supervised learning were promising for FOG detection and prediction since they provided person-specific tuning while preserving model generalization.

1. Introduction

Parkinson’s disease (PD) is a progressive neurodegenerative condition that presents numerous life-altering symptoms, including the characteristic upper-limb trembling [1]. In moderate to advanced PD, locomotion can deteriorate into a flexed upper body posture with small shuffling steps, an anteriorly-shifted centre of mass, decreased walking speed, poor balance, increased gait variability, and freezing of gait (FOG) [2,3,4,5,6].
A FOG episode is a complex and highly-variable phenomenon defined as a “brief, episodic absence or marked reduction of forward progression of the feet despite the intention to walk” [7]. Freezing is often described as the sensation of having one’s feet glued to the floor with an inability to initiate the next step, and becomes increasingly common as PD progresses [2,8]. Although typically lasting only a few seconds [9], freezes can lead to falls [10,11,12]. Since FOG can occur multiple times a day, most commonly between doses when medication is wearing off [11,13], FOG-related fall risk is an ever-present concern. Fall-related injury, reduced mobility, fear of future falls, and decreased independence are all linked to FOG and can contribute to a reduced quality of life [14,15,16,17,18].
FOG occurrences are difficult to anticipate and may not manifest during clinic visits [3]. Therefore, assessing and adjusting FOG treatments can be challenging for medical professionals. In-home monitoring and automatic freeze-detection systems have been developed [19,20] and used to objectively track freezes over extended periods; however, these systems do not prevent or reduce freezing occurrences. Cueing devices that provide an external stimulus have emerged to prevent imminent FOG episodes or to help overcome episodes already in progress [21,22]. Continuous stimuli include auditory (e.g., a rhythmic tone), visual (e.g., lines projected on the floor in front of the person), or tactile (e.g., a vibrating device on the skin) cues. As an alternative to continuous cueing, some cueing systems detect a freeze as it occurs and provide a cue to help the person resume normal walking [23,24,25,26]. If FOG could be predicted, a cue could be provided before the event to prevent the freeze from occurring [27,28].
Accurate and automatic FOG detection and prediction are essential for long-term symptom monitoring or preemptive mitigation via cueing. Wearable sensors are vital for FOG detection and prediction systems to ensure unrestricted daily use in a person’s chosen environment. The complexity of FOG symptoms and their highly-variable manifestations have led to the creation of systems with numerous sensors on various body parts and a wide array of FOG detection methods, ranging from simple thresholds to machine-learning approaches. While good results have been reported, automatic and reliable FOG detection and prediction are far from resolved. A previous review of state-of-the-art technology in this field showed the widespread use and effectiveness of wearable sensors in FOG detection [26]. To help guide further research, the current paper presents an up-to-date review of the state of the art of FOG detection and prediction, and provides further details on the study populations, classification methods, and features used. The current state of the art is also assessed to highlight current challenges and limitations in FOG detection and prediction using wearable sensors. The outcomes from this review provide guidance on current knowledge and identify knowledge gaps that need to be filled to advance wearable-sensor-based assistive technologies that can improve the lives of people with PD.

2. Materials and Methods

A literature review was performed by searching the Scopus, PubMed, and Web of Science databases. Keywords included “sensor”, “device”, or “wearable”; “Parkinson”; “freeze” or “freezing”; and “detect” or “predict”. The final search was performed on April 16, 2019. Results were curated using the Mendeley Desktop software (v. 1.19.4; Elsevier, Amsterdam, The Netherlands).
Duplicates were removed and the results were pooled for screening, using article title, abstract, and keywords to determine relevance. Following screening, the remaining documents were reviewed in full.
Eligibility for analysis required that studies:
  • Used wearable sensor data as input (direct from a sensor or from wearable-sensor datasets).
  • Involved people with PD, or data from people with PD, who experience FOG.
  • Had the primary goal of detecting or predicting FOG. Articles were not included if they examined cueing using a FOG detection method developed in previous research and reported in another article, or if they only classified individuals as freezers or non-freezers, rather than detecting freezing episodes.
Articles were excluded if they were not published in English, if they were not full texts (abstract only publications were excluded), or if they lacked adequate descriptions and explanations of the detection or prediction methods (i.e., training and testing methods not described, important variables not defined, results not presented).
Eligible articles were used to extract, where available, the following characteristics: population, data collection location and summary, sensor type and location, FOG detection and prediction method (i.e., classifier or machine-learning algorithm), features, whether feature extraction and selection were used, classification performance, and evaluation in real-time.
Article characteristics included:
  • Population: The number of participants in the study, i.e., healthy controls (HC), people with FOG symptoms (FOG-PD), people with no FOG symptoms (NFOG-PD), and FOG symptom status unknown or not reported (UFOG-PD); the number of PD participants who froze during data collection; medication state during data collection (ON or OFF); and the number of FOG episodes.
  • Data collection location and summary: Whether data collection was performed in a laboratory setting or in the participant’s home. Summary of walking tasks performed.
  • Sensor type and location: The type and number of sensors used, sensor location on the body.
  • FOG detection method: Methods used to detect and predict FOG, i.e., general approach (e.g., machine-learning model), model training method (person-specific: trained using data from a single person; or person-independent: trained using data from multiple people and not customized for an individual), whether the data were windowed, window length, and extent of detection (e.g., detection performed on each data point, window, or FOG event). Where multiple methods were attempted, the method with the best performance or research focus was reported.
  • Feature extraction and feature selection: Features are variables calculated from sensor data. Feature selection uses feature ranking, filtering, or other techniques to produce an appropriate feature subset with fewer redundant features. Reporting features that performed best in FOG detection or comparing detection performance of different features after model testing was not considered as feature selection.
  • Classifier performance: Sensitivity, specificity, and other reported performance metrics.
  • Real-time: Reporting the detection of a FOG episode as it occurs. In this review, real-time refers to detection using a live wearable-sensor data stream.
Feature analysis included:
  • Feature Name: Feature name or a short description if not named in the cited article.
  • Sensor Type: The type of sensor used to calculate the feature: accelerometer (Acc), gyroscope (Gyro), force sensitive resistor (FSR), electromyography (EMG), electroencephalogram (EEG), galvanic skin response (GSR), goniometer, telemeter, or camera-based motion capture (CBMC) (included if used with a wearable sensor).
  • Sensor Location: Body location where the sensor was placed.
  • Feature Description: Brief explanation of the feature.
  • Source: Articles that used the feature as input for FOG detection or prediction.

3. Results

The initial search provided 323 documents. An additional 10 articles that did not appear in the search but were referenced by other articles were included, resulting in 333 articles. After removing duplicates, 178 documents were available (Figure 1). Following screening and eligibility assessment, 74 articles were included in the review: 68 on FOG detection, seven on FOG prediction, and one article in both categories.
Study characteristics related to population, data collection location and summary, sensor type and location, FOG detection method, feature extraction and selection, classifier performance, and whether analysis was performed in real time are presented in Table 1. Features extracted from wearable sensor data are presented in Table 2. Table 3 presents a summary of the top machine-learning methods from studies that compared different machine-learning classifiers for FOG detection using wearable sensors.

4. Discussion

4.1. FOG Detection

FOG detection methods vary in complexity, with the simplest models directly comparing wearable sensor variables to thresholds [29,41,44,46,53,63,73,123]. Threshold methods tended to have poorer detection performance but faster processing time, making them potentially useful in real-time systems [24,70,77,79,92,93]. To improve classification performance, features that can better differentiate between FOG and typical PD gait have been used, such as Fourier transforms [29,34,35,41,44,53,65,69,78], wavelet transforms [51,56,63,71,79,83,91,92,93,96], k-index [59,60,61,62,72,73], freezing of gait criterion (FOGC) [46], freezing of gait detection on glasses (FOGDOG) [70], R-index [94], and the widely-used freeze index [29].
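The freeze index [29] mentioned above is typically computed as the ratio of spectral power in a “freeze” band to power in a “locomotor” band of windowed acceleration data. The following Python sketch illustrates this type of calculation under common assumptions (single-axis acceleration, 0.5–3 Hz locomotor band, 3–8 Hz freeze band); the exact band edges, windowing, and thresholds vary between the cited studies.

```python
import numpy as np

def freeze_index(acc_window, fs, locomotor_band=(0.5, 3.0), freeze_band=(3.0, 8.0)):
    """Ratio of spectral power in the freeze band to the locomotor band.

    acc_window: 1-D array of single-axis acceleration samples.
    fs: sampling frequency in Hz.
    Band edges are common choices in the literature, not fixed constants.
    """
    detrended = acc_window - np.mean(acc_window)
    power = np.abs(np.fft.rfft(detrended)) ** 2
    freqs = np.fft.rfftfreq(len(detrended), d=1.0 / fs)

    def band_power(lo, hi):
        mask = (freqs >= lo) & (freqs < hi)
        return power[mask].sum()

    return band_power(*freeze_band) / max(band_power(*locomotor_band), 1e-12)

# A window is flagged as FOG when the index exceeds a study- or
# person-specific threshold.
```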
To further improve FOG detection performance, multiple features and machine-learning (ML) techniques have been used, such as neural networks [36,38,55,66,76,80,85,86,88,89,91], decision trees [25,39,42,45,52,54,58,85], random forests [39,42,43], naïve Bayes [42,43], nearest neighbor [42], and support vector machines [64,71,74,75,81,83,86]. In addition, anomaly detection [20] and unsupervised machine learning [87] have been attempted, but not extensively explored.
The best performing classifiers for FOG detection were convolutional neural networks, support vector machines, random forest, and AdaBoosted decision trees (Table 3).

4.1.1. Decision Trees

Decision trees are a series of binary selections that form branches resembling a tree structure. More complex decision trees can improve performance. For example, random forest classifiers use multiple decision trees, where the final decision is the majority vote of the individual trees. Boosting can also improve performance. AdaBoosting (adaptive boosting) repeatedly retrains the classifier, placing increasing importance on incorrectly classified training examples [124,125]. LogitBoosting (logistic boosting) [126], RUSBoosting [127], and RobustBoosting [128] are extensions of AdaBoosting that can further improve performance [85]. Decision trees for FOG detection included ensembles of trees and boosting techniques [42,43,85], with performance results ranging from 66.25% to 98.35% for sensitivity and 66.00% to 99.72% for specificity [25,39,42,43,45,52,54,58,85].
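As a minimal illustration of these ensemble ideas (not a reproduction of any cited study’s pipeline), the sketch below sets up a random forest and an AdaBoosted decision-tree classifier in scikit-learn, assuming a per-window feature matrix X_train and binary FOG labels y_train.

```python
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier

# Random forest: the final decision is the majority vote of many
# independently-trained decision trees.
rf = RandomForestClassifier(n_estimators=200, random_state=0)

# AdaBoost: each round reweights the training windows so that examples
# misclassified in earlier rounds receive more attention; the default
# base learner is a shallow decision tree.
ada = AdaBoostClassifier(n_estimators=100, learning_rate=0.5, random_state=0)

# rf.fit(X_train, y_train); ada.fit(X_train, y_train)
# y_pred = rf.predict(X_test)
```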

4.1.2. Support Vector Machines (SVM)

Support vector machines are binary (two-class) classifiers that find a hyperplane separating the data points of the two classes. New data points are then classified based on which side of the hyperplane they fall. If the data points are not linearly separable, a kernel can transform the data into a higher-dimensional space where they become separable [125]. SVM for FOG detection achieved 74.7%–99.73% sensitivity and 79.0%–100% specificity [64,71,74,75,81,83,86].
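A minimal SVM sketch with an RBF kernel is shown below, assuming the same kind of windowed feature matrix; standardization is included because SVMs are sensitive to feature scaling. Kernel and hyperparameter choices in the cited studies differ.

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# The RBF kernel implicitly maps features into a higher-dimensional space
# where FOG and non-FOG windows may become linearly separable.
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
# svm.fit(X_train, y_train)
# sensitivity and specificity can then be computed from svm.predict(X_test)
```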

4.1.3. Neural Networks

Neural networks (NN) are made up of interconnected layers of nodes inspired by the structure of neurons in the brain [129]. NN have been frequently used in FOG detection and prediction studies. For FOG detection, model performance achieved 72.2%–99.83% sensitivity and 48.4%–99.96% specificity [36,38,55,66,76,80,85,86,88,89,91]. Neural networks for FOG prediction tended to perform slightly worse, up to 86% sensitivity, 80.25% specificity, and 89% precision [96,97,99].
Different NN subtypes have been used in FOG detection and prediction, such as convolutional [85,90] and recurrent [97,100] NN. Convolutional neural networks (CNN) have become popular in numerous applications, including medical image analysis, in part due to their ability to recognize local patterns within images and because feature selection prior to implementation is not required [130,131]. CNN performed well for FOG detection [85], achieving 91.9% sensitivity and 89.5% specificity. Recurrent NN have recently been used for FOG prediction due to their applicability to time-series data [97,100]. Recurrent neural networks (RNN) utilize previous data in addition to current inputs during classification [132], thus giving the network “memory” to help recognize sequences [133]. A long short-term memory network (LSTM), a type of RNN, was used for FOG prediction [100], achieving over 90% accuracy when predicting FOG 5 s in advance.
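As an illustration of how a recurrent model can be applied to windowed inertial data, the sketch below defines a small LSTM classifier in Keras with hypothetical shapes (128-sample windows, six inertial channels) and three classes (walking, pre-FOG, FOG); it is not the architecture used in any specific cited study.

```python
import tensorflow as tf

# Hypothetical input: windows of 128 samples x 6 channels (3-axis acc + gyro).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(128, 6)),
    tf.keras.layers.LSTM(64),                        # memory across the window
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),  # walking / pre-FOG / FOG
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=30)
```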

4.1.4. Unsupervised and Semi-Supervised Models

Since freezing manifests differently for each person, person-specific models outperformed person-independent models [42,58,74,86] (with some exceptions, as in [53]). However, in practice, it is difficult to obtain enough data to develop a model for an individual. To address this small-dataset problem, unsupervised learning has been attempted. These methods do not rely on experts labelling FOG episodes. Instead, clustering techniques are used to define the classes [87], or an anomaly detection approach is used to define the normal class and then identify abnormalities (such as FOG) that do not conform to that class [45,90]. Unsupervised FOG detection approaches are appealing since they do not require data labelling; however, few studies have used unsupervised FOG detection, and unsupervised model performance has been worse than that of supervised models [90].
Recently, transfer learning, which uses a previously-trained network as a base and adapts the model to a new task [100], and semi-supervised learning, which uses both labeled and unlabeled data during training [69,88,89], have been used to create partly personalized FOG detection methods without large amounts of data. In [100], transfer learning trained a neural network using group data before adding an additional network layer that was trained using an individual’s data. Semi-supervised learning methods [69,88,89] use labeled data to train a base classifier before updating it in an unsupervised manner. This reduces the need for labeled data and preserves the generalization ability gained from a multi-person dataset, while allowing person-specific tuning. Semi-supervised learning theoretically combines the advantages of both supervised and unsupervised learning. When applied to FOG detection, performance reached 89.2%–95.9% sensitivity and 93.1%–95.6% specificity [69,88,89]. Although these methods are promising, their value for FOG detection remains unclear due to the current shortage of studies.
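The sketch below illustrates the general transfer-learning idea described above under assumed layer sizes and input shapes: a base network is trained on pooled multi-person data, its feature layers are frozen, and a small person-specific head is fine-tuned on the individual’s data. It is a schematic, not the model of [100].

```python
import tensorflow as tf
from tensorflow.keras import layers

# Base model trained on pooled multi-person data (group_X, group_y assumed).
inputs = tf.keras.Input(shape=(128, 6))
x = layers.Conv1D(32, 9, activation="relu")(inputs)
x = layers.GlobalAveragePooling1D()(x)
shared = layers.Dense(32, activation="relu", name="shared")(x)
base_out = layers.Dense(2, activation="softmax")(shared)
base = tf.keras.Model(inputs, base_out)
base.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# base.fit(group_X, group_y, epochs=20)

# Freeze the shared feature layers and attach a person-specific head,
# trained on the individual's (much smaller) labelled dataset.
for layer in base.layers:
    layer.trainable = False
personal_head = layers.Dense(16, activation="relu")(shared)
personal_out = layers.Dense(2, activation="softmax")(personal_head)
personal = tf.keras.Model(inputs, personal_out)
personal.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# personal.fit(person_X, person_y, epochs=10)
```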

4.1.5. Limitations and Challenges of FOG Detection

FOG detection and prediction are affected by the participant’s medication state (ON or OFF), which has substantial effects on motor control, gait patterns, and physical abilities. Freezing occurs more frequently in the OFF state than in the ON state. In the OFF state, smaller shuffling steps are common, whereas in the ON state, many people can walk fairly normally. A machine-learning model trained during a person’s optimal medication state may perform worse once the medication wears off and their unassisted gait changes. Given that medication is needed in PD management, medication state is crucial contextual information for FOG detection and prediction research.
With machine-learning algorithms becoming more prevalent, larger FOG detection and prediction datasets are needed for model development. FOG studies ranged from 1 to 32 participants, with most studies having more than 10 participants. Studies involving few participants may not adequately validate a FOG detection method, especially when machine-learning algorithms are involved. Data augmentation techniques [85] or additional testing with more participants are required. On the other hand, large participant pools may not guarantee unbiased datasets since some participants freeze many times during data collection, while others may not freeze at all. For example, in [48], only 6 of 20 participants froze during data collection, which may lead to person-biased models that over-represent the few individuals with FOG data. Difficulty in participant recruitment and FOG unpredictability are therefore challenges that may limit the availability and quality of training data.
Following data collection, FOG episodes are typically identified and labelled visually. Visual FOG identification is currently the gold standard, and these labels serve as the ground truth for detection method validation. Even though FOG is a well-defined clinical phenomenon [7], the criteria used to mark the beginning and end of FOG episodes [24,25,98] were not stated in some articles. Differing FOG definitions make comparison between studies problematic. Published datasets can provide consistent ground-truth FOG labelling. The Daphnet [24] (10 participants) and CuPiD [101] (18 participants) datasets provide consistent input but fewer than 250 FOG episodes; thus, dataset size may be an issue for machine learning, especially if deep learning is used [85].
When evaluating a classification system, different data should ideally be used for training and testing, as in [25,38,51,55,56,64,66,67,85,96,97,99,100], to prevent the overestimation of model performance that can occur when a model is evaluated on the same data used to train it. Cross-validation is often used when the dataset size is limited, as done in [24,31,32,33,39,42,43,45,52,54,58,74,75,81,86,87,88,89,90,98]. For FOG research, leave-one-person-out cross-validation was the most common: model training used data from all but one participant, model testing used data from the remaining participant, the process was repeated for each participant, and the performance results were averaged. Other studies, often more preliminary in nature, used ad hoc optimization to tune parameters and set thresholds [34,44,48,59,60,61,62,63,95]. This approach, although useful for initial system assessment, is not a good indicator of classifier performance and should be followed by a more robust evaluation scheme, such as cross-validation.
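A minimal sketch of leave-one-person-out cross-validation with scikit-learn is shown below, assuming a per-window feature matrix X, labels y, and a groups array holding each window’s participant identifier.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

logo = LeaveOneGroupOut()
clf = RandomForestClassifier(n_estimators=200, random_state=0)

# Each fold trains on all participants except one and tests on the held-out
# person; averaging across folds approximates person-independent performance.
# scores = cross_val_score(clf, X, y, groups=groups, cv=logo,
#                          scoring="balanced_accuracy")
# print(scores.mean(), scores.std())
```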
Feature calculation from wearable sensor data is typically done using data windows. Window lengths ranged from 0.2 to 32 s [36,48,92,93], with the most common window length being 1 s. Long windows with many sample points are desirable for calculating frequency-based features involving the discrete Fourier transform, since the number of sample points in the input signal will determine the output frequency bin resolution. However, long windows decrease the temporal resolution and do not permit distinguishing short events within the window. In addition, long windows with many data points may be slower to process and may introduce unwanted lags between data acquisition and classification for detection or prediction. Studies comparing multiple window lengths found that, in general, 1–4 s windows are preferable [42,44,48,57,63,64].
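The sketch below shows one way to segment a continuous sensor stream into overlapping windows and compute simple per-window time-domain features; the 2 s window and 0.5 s step are assumptions chosen within the 1–4 s range noted above.

```python
import numpy as np

def windowed_features(signal, fs, win_s=2.0, step_s=0.5):
    """Segment a 1-D signal into overlapping windows of win_s seconds.

    Returns an (n_windows, 3) array of [mean, standard deviation, RMS].
    """
    win = int(win_s * fs)
    step = int(step_s * fs)
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        feats.append([w.mean(), w.std(), np.sqrt(np.mean(w ** 2))])
    return np.asarray(feats)

# Frequency-domain features (e.g., the freeze index above) can be appended
# per window in the same loop.
```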
FOG detection studies used different performance metrics. For example, a FOG detection system used to trigger a real-time cue during walking might emphasize freeze onset detection. This detection system might attempt to classify every data point or window as FOG or no freeze, and be evaluated using the number of correctly classified instances [24,31,32,33]. In contrast, a long-term monitoring system may treat each freeze occurrence as a binary event and evaluate whether the FOG event was successfully detected [74,75]. Experimental procedures and underlying definitions, such as ignoring FOG shorter than 3 s [43] or calculating specificity with data from participants without FOG [64], also varied between studies. Differences in evaluation metrics and procedures make FOG detection method comparisons more difficult.
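As a small illustration of the window-level metrics most studies report, the sketch below computes sensitivity and specificity from per-window predictions; an event-level evaluation would instead iterate over annotated FOG episodes and count an episode as detected if any window within it is classified as FOG.

```python
from sklearn.metrics import confusion_matrix

def sensitivity_specificity(y_true, y_pred):
    """Window-level sensitivity (FOG recall) and specificity (non-FOG recall).

    y_true, y_pred: arrays of per-window labels with 1 = FOG, 0 = no FOG.
    """
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    return sensitivity, specificity
```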
To help compare future FOG detection and prediction studies, researchers should report study population details, including sex, PD severity, number of participants, the number and duration of FOG episodes (ideally for each person), and medication state during testing. Methodologically, the FOG labelling criteria, detailed detection method, validation method, and the basis upon which the performance evaluation metrics are calculated should be clearly stated.

4.2. FOG Prediction

The FOG prediction studies varied in approach and performance, with most being somewhat preliminary and focusing less on performance and more on understanding the intricacies of FOG prediction. In addition to the considerations of FOG detection studies (e.g., dataset size, medication state, FOG definitions, contextual or study-specific performance metric definitions), FOG prediction studies must define the pre-FOG class using data before freeze onset. FOG prediction is typically done by training a machine-learning model to recognize data from the pre-FOG class. Six of the seven FOG prediction studies selected a pre-FOG segment duration that ranged from 1 s [45,96,100] to 6 s [45]. Since the transition from walking to FOG is subtle, labelling the start of pre-FOG from visual observation is difficult. Instead, a FOG episode is visually identified, and data prior to the FOG are selected using a single fixed duration. Three studies used a 5 s period [96,97,99]; one study used a 2 s period [98]; one used 1, 3, and 5 s periods [100]; and one used 1–6 s periods, in 1 s increments [45]. The seventh study [95] used an assumed 3 s period before FOG for feature selection; then, a person-specific, multivariate Gaussian-distribution-based anomaly-detection model was created and manually tuned for each participant.
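The sketch below illustrates fixed-duration pre-FOG labelling, assuming annotated FOG onset sample indices and a known sampling rate; the pre-FOG duration is a tunable assumption, as the studies above make clear.

```python
import numpy as np

def label_pre_fog(n_samples, fog_onsets, fs, pre_s=2.0):
    """Label samples as 0 = other, 1 = pre-FOG (pre_s seconds before each onset).

    fog_onsets: sample indices where annotated FOG episodes begin.
    Samples within FOG episodes would be labelled separately (e.g., class 2).
    """
    labels = np.zeros(n_samples, dtype=int)
    pre = int(pre_s * fs)
    for onset in fog_onsets:
        labels[max(0, onset - pre):onset] = 1
    return labels
```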
Optimal pre-FOG segment duration is difficult to determine. If the pre-FOG segment is assumed to be a linear degradation of gait leading to FOG (threshold theory [134]), data closest to the freeze would resemble FOG, and data farther from the freeze would resemble typical PD walking. For a two-class classifier (pre-FOG, typical PD walking), short pre-FOG segments are preferred, since data are closer to FOG onset and likely more distinct from typical walking [100].
A short pre-FOG segment may not be ideal when using a three-class classifier consisting of typical PD walking, pre-FOG, and FOG classes as in [45], which found that very short pre-FOG segments made it difficult to distinguish between the pre-FOG and FOG classes. Longer pre-FOG segments improved pre-FOG classification but greatly reduced FOG and typical walking classification accuracy. The best performing pre-FOG segment duration differed across participants, and likely between individual FOG episodes for the same person [45]. The observation that a single pre-FOG duration is inadequate is also supported by [95,98]. For this reason, a person-specific or episode-specific pre-FOG duration may help to reduce overlap with the walking class and increase class purity (contain only pre-FOG data), thus improving pre-FOG detection performance.

4.3. Features Used in FOG Detection and Prediction

A variety of features have been used in FOG detection and prediction. While most FOG detection and prediction features were previously established in non-PD applications [135,136,137,138], custom features were created to detect FOG, namely, freezing of gait criterion (FOGC) [46,47], freezing of gait detection on glasses (FOGDOG) [70], k value [59,60,61,62,72,73], R value [94], freeze index [29], K freeze index [67], and multichannel freeze index [67]. Time domain features, such as maximum acceleration amplitude within a window [40,41] or rotation about a single axis [98], are relatively simple and fast to compute. Gait-based features such as cadence [49], stride duration [35,50,71], and step length [35,71,77], as well as statistical features including mean [42,45,52,54,57,58,71,74,75,77,80,81,95], standard deviation [42,45,52,54,57,58,71,74,75,77,80,81,88,89,95,98], and root mean square [45,48,54,78,86] are also calculated from time domain data. Frequency domain features include freeze index (FI) [29], which was the most widely-used frequency domain feature [23,24,25,29,31,32,33,34,42,44,49,53,54,64,65,69,71,77,78,80,86,88,89,98], peak amplitude and corresponding frequency [40,41,64], standard deviation in frequency domain [50,57,64,74], spectral density centre of mass [50,57,66,74,80,81,86,96], and power of the signal in specific frequency bands [24,25,31,32,33,34,40,42,52,54,58,75,80]. While Fourier transforms are typically used to convert signals from the time domain to frequency domain, Fourier transform limitations have led to increased usage of wavelet approaches [51,56,63,71,79,83,91,92,93,96].
A diverse feature set can better represent the wide range of FOG manifestations than any single feature. Studies that combined time and frequency domain features [96] had better performance than those using either type of feature individually. Time domain features can account for gait parameters such as step length [35,71,77], cadence [49], asymmetry [45], and peak limb angular velocity [88,89], whereas frequency domain features can capture more subtle patterns characteristic of FOG, such as trembling in specific frequency bands [29]. The best performance is typically achieved with multiple features.
The choice of features is very important, especially for real-time systems, where, in addition to classification performance, classification speed is critical. For example, calculating stride duration at the end of the stride (approximately 1 s) could delay the detection of a FOG event. Other features such as step length, cadence, cadence variation, stride peaks, and FOGC may share this limitation, depending on the feature calculation method. Features extracted from appropriately-sized windowed data do not have this problem, since the features can be calculated as soon as the data window is available; feature availability to the classifier is then determined by the step size of the sliding window and the calculation delay. All of the window-based features in Table 2 could be used in a real-time application, given sufficient processing power. However, an excessive number of features, or complex features requiring many calculation stages, may introduce unacceptable delays when computing power is limited, as in many wearable systems. Using a minimal number of easily-calculated features is desirable; however, too few or overly-simple features may adversely impact classification performance. To balance classification performance against classification speed, feature selection algorithms can be used to determine the best features from a larger set, as implemented in [45,51,55,56,58,66,67,76,80,83,86,95,96,98,99]. Algorithms such as Relief-F or correlation-based approaches rank features according to their relevance so that the least relevant can be eliminated [139]. The most used feature selection methods in this review were paired t-tests [86,98], mutual information [45,58,67,95], and the Wilcoxon rank sum test [55,66,76,96]. Feature selection is a broad topic encompassing numerous methods that can improve classifier models. Given the diversity of features in the literature for detecting and predicting FOG, the best feature or feature set has yet to be determined. For future studies, it is generally suggested to begin with multiple features that can then be tuned or eliminated using feature selection methods to produce an optimal feature set.
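A minimal feature-selection sketch using mutual information, one of the approaches used in the cited studies, is shown below with scikit-learn; the number of retained features (k = 10) is an arbitrary assumption.

```python
from sklearn.feature_selection import SelectKBest, mutual_info_classif

# Rank features by mutual information with the FOG/no-FOG labels and keep
# only the k most informative ones before training the classifier.
selector = SelectKBest(score_func=mutual_info_classif, k=10)
# X_reduced = selector.fit_transform(X_train, y_train)
# selected = selector.get_support(indices=True)  # indices of retained features
```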

5. Conclusions

Based on 74 freezing of gait detection and prediction articles, this review reported details of the participants, walking tasks, sensors, features extracted, detection and prediction methods, and performance. The continued development of high-performing FOG detection methods is important for long-term monitoring and real-time cueing and, together with the development of FOG prediction systems, for implementation in gait-assist systems. While FOG detection methods have been steadily increasing in performance, important challenges remain. Small FOG datasets may limit the machine-learning models that can be used, especially for deep learning. Sets of diverse features in both the time and frequency domains have helped to represent the inconsistent nature of FOG. The adoption of transfer-learning and semi-supervised-learning models, built upon established FOG detection methods, could add an element of personalization while preserving the robust generalization of person-independent models, making them promising approaches for future FOG detection and prediction research.

Author Contributions

Conceptualization, S.P., J.K., J.N. and E.D.L.; Methodology, S.P., J.K., J.N. and E.D.L.; Investigation, S.P.; Data curation, S.P.; Writing—original draft preparation, S.P.; Writing—review and editing, S.P., J.K., J.N. and E.D.L.; Supervision, J.K., J.N. and E.D.L.; Project administration, J.K.

Funding

This research was funded by the Natural Sciences and Engineering Research Council of Canada (NSERC) and University of Waterloo.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Jankovic, J. Parkinson’s disease: Clinical features and diagnosis. J. Neurol. Neurosurg. Psychiatry 2008, 79, 368–376. [Google Scholar] [CrossRef] [PubMed]
  2. Macht, M.; Kaussner, Y.; Möller, J.C.; Stiasny-Kolster, K.; Eggert, K.M.; Krüger, H.-P.; Ellgring, H. Predictors of freezing in Parkinson’s disease: A survey of 6620 patients. Mov. Disord. 2007, 22, 953–956. [Google Scholar] [CrossRef] [PubMed]
  3. Hausdorff, J.M.; Cudkowicz, M.E.; Firtion, R.; Wei, J.Y.; Goldberger, A.L. Gait variability and basal ganglia disorders: Stride-to-stride variations of gait cycle timing in Parkinson’s disease and Huntington’s disease. Mov. Disord. 1998, 13, 428–437. [Google Scholar] [CrossRef] [PubMed]
  4. Baltadjieva, R.; Giladi, N.; Gruendlinger, L.; Peretz, C.; Hausdorff, J.M. Marked alterations in the gait timing and rhythmicity of patients with de novo Parkinson’s disease. Eur. J. Neurosci. 2006, 24, 1815–1820. [Google Scholar] [CrossRef] [PubMed]
  5. Hausdorff, J.M. Gait dynamics in Parkinson’s disease: Common and distinct behavior among stride length, gait variability, and fractal-like scaling. Chaos Interdiscip. J. Nonlinear Sci. 2009, 19, 026113:1–026113:14. [Google Scholar] [CrossRef]
  6. Sofuwa, O.; Nieuwboer, A.; Desloovere, K.; Willems, A.-M.; Chavret, F.; Jonkers, I. Quantitative gait analysis in Parkinson’s disease: Comparison with a healthy control group. Arch. Phys. Med. Rehabil. 2005, 86, 1007–1013. [Google Scholar] [CrossRef]
  7. Nutt, J.G.; Bloem, B.R.; Giladi, N.; Hallett, M.; Horak, F.B.; Nieuwboer, A. Freezing of gait: Moving forward on a mysterious clinical phenomenon. Lancet Neurol. 2011, 10, 734–744. [Google Scholar] [CrossRef]
  8. Snijders, A.H.; Nijkrake, M.J.; Bakker, M.; Munneke, M.; Wind, C.; Bloem, B.R. Clinimetrics of freezing of gait. Mov. Disord. 2008, 23, 468–474. [Google Scholar] [CrossRef]
  9. Schaafsma, J.D.; Balash, Y.; Gurevich, T.; Bartels, A.L.; Hausdorff, J.M.; Giladi, N. Characterization of freezing of gait subtypes and the response of each to levodopa in Parkinson’s disease. Eur. J. Neurol. 2003, 10, 391–398. [Google Scholar] [CrossRef]
  10. Okuma, Y. Freezing of gait and falls in Parkinson’s disease. J. Park. Dis. 2014, 4, 255–260. [Google Scholar] [CrossRef]
  11. Okuma, Y.; Silva de Lima, A.L.; Fukae, J.; Bloem, B.R.; Snijders, A.H. A prospective study of falls in relation to freezing of gait and response fluctuations in Parkinson’s disease. Park. Relat. Disord. 2018, 46, 30–35. [Google Scholar] [CrossRef] [PubMed]
  12. Bloem, B.R.; Hausdorff, J.M.; Visser, J.E.; Giladi, N. Falls and freezing of gait in Parkinson’s disease: A review of two interconnected, episodic phenomena. Mov. Disord. 2004, 19, 871–884. [Google Scholar] [CrossRef] [PubMed]
  13. Nantel, J.; Bronte-Stewart, H. The effect of medication and the role of postural instability in different components of freezing of gait (FOG). Park. Relat. Disord. 2014, 20, 447–451. [Google Scholar] [CrossRef] [PubMed]
  14. Masud, T.; Morris, R.O. Epidemiology of falls. Age Ageing 2001, 30, 3–7. [Google Scholar] [CrossRef]
  15. Young, W.R.; Mark Williams, A. How fear of falling can increase fall-risk in older adults: Applying psychological theory to practical observations. Gait Posture 2015, 41, 7–12. [Google Scholar] [CrossRef]
  16. Ward-Griffin, C.; Hobson, S.; Melles, P.; Kloseck, M.; Vandervoort, A.; Crilly, R. Falls and fear of falling among community-dwelling seniors: The dynamic tension between exercising precaution and striving for independence. Can. J. Aging 2004, 23, 307–318. [Google Scholar] [CrossRef]
  17. Adkin, A.L.; Frank, J.S.; Jog, M.S. Fear of falling and postural control in Parkinson’s disease. Mov. Disord. 2003, 18, 496–502. [Google Scholar] [CrossRef]
  18. Bloem, B.R.; Grimbergen, Y.A.M.; Cramer, M.; Willemsen, M.; Zwinderman, A.H. Prospective assessment of falls in Parkinson’s disease. J. Neurol. 2001, 248, 950–958. [Google Scholar] [CrossRef]
  19. Kim, H.B.; Lee, H.J.; Lee, W.W.; Kim, S.K.; Jeon, H.S.; Park, H.Y.; Shin, C.W.; Yi, W.J.; Jeon, B.; Park, K.S. Validation of freezing-of-gait monitoring using smartphone. Telemed. e-Health 2018, 24, 899–907. [Google Scholar] [CrossRef]
  20. Pham, T.T.; Nguyen, D.N.; Dutkiewicz, E.; McEwan, A.L.; Leong, P.H.W. An anomaly detection technique in wearable wireless monitoring systems for studies of gait freezing in Parkinson’s disease. In Proceedings of the 2017 International Conference on Information Networking (ICOIN), Da Nang, Vietnam, 11–13 January 2017; pp. 41–45. [Google Scholar] [CrossRef]
  21. Ginis, P.; Nackaerts, E.; Nieuwboer, A.; Heremans, E. Cueing for people with Parkinson’s disease with freezing of gait: A narrative review of the state-of-the-art and novel perspectives. Ann. Phys. Rehabil. Med. 2018, 61, 407–413. [Google Scholar] [CrossRef]
  22. Nieuwboer, A. Cueing for freezing of gait in patients with Parkinson’s disease: A rehabilitation perspective. Mov. Disord. 2008, 23, 475–481. [Google Scholar] [CrossRef]
  23. Jovanov, E.; Wang, E.; Verhagen, L.; Fredrickson, M.; Fratangelo, R. deFOG—A real time system for detection and unfreezing of gait of Parkinson’s patients. In Proceedings of the 31st Annual International Conference of the IEEE Engineering in Medicine and Biology Society: Engineering the Future of Biomedicine, Minneapolis, MN, USA, 3–6 September 2009; pp. 5151–5154. [Google Scholar] [CrossRef]
  24. Bachlin, M.; Plotnik, M.; Roggen, D.; Maidan, I.; Hausdorff, J.M.; Giladi, N.; Troster, G. Wearable assistant for Parkinson’s disease patients with the freezing of gait symptom. IEEE Trans. Inf. Technol. Biomed. 2010, 14, 436–446. [Google Scholar] [CrossRef] [PubMed]
  25. Mazilu, S.; Blanke, U.; Hardegger, M.; Troster, G.; Gazit, E.; Dorfman, M.; Hausdorff, J.M. GaitAssist: A wearable assistant for gait training and rehabilitation in Parkinson’s disease. In Proceedings of the 2014 IEEE International Conference on Pervasive Computing and Communication Workshops (PERCOM WORKSHOPS), Budapest, Hungary, 24–28 March 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 135–137. [Google Scholar] [CrossRef]
  26. Silva de Lima, A.L.; Evers, L.J.W.; Hahn, T.; Bataille, L.; Hamilton, J.L.; Little, M.A.; Okuma, Y.; Bloem, B.R.; Faber, M.J. Freezing of gait and fall detection in Parkinson’s disease using wearable sensors: A systematic review. J. Neurol. 2017, 264, 1642–1654. [Google Scholar] [CrossRef] [PubMed]
  27. Nieuwboer, A.; Dom, R.; De Weerdt, W.; Desloovere, K.; Janssens, L.; Stijn, V. Electromyographic profiles of gait prior to onset of freezing episodes in patients with Parkinson’s disease. Brain 2004, 127, 1650–1660. [Google Scholar] [CrossRef] [PubMed]
  28. Ferster, M.L.; Mazilu, S.; Tröster, G. Gait parameters change prior to freezing in Parkinson’s disease: A data-driven study with wearable inertial units. In Proceedings of the 10th EAI International Conference on Body Area Networks, Sydney, Australia, 28–30 September 2015; EAI: Ghent, Belgium, 2015; pp. 159–166. [Google Scholar] [CrossRef]
  29. Moore, S.T.; MacDougall, H.G.; Ondo, W.G. Ambulatory monitoring of freezing of gait in Parkinson’s disease. J. Neurosci. Methods 2008, 167, 340–348. [Google Scholar] [CrossRef] [PubMed]
  30. Zabaleta, H.; Keller, T.; MartÍ Massó, J.F. Power spectral distribution analysis for detection of freezing of gait in patients with Parkinson’s disease. In Proceedings of the 4th European Conference of the International Federation for Medical and Biological Engineering, IFMBE, Antwerp, Belgium, 23–27 November 2008; Vander SLoten, J., Verdonck, P., Nyssen, M., Haueisen, J., Eds.; Springer: Berlin, Germany, 2008; pp. 2089–2092. [Google Scholar] [CrossRef]
  31. Bachlin, M.; Roggen, D.; Troster, G.; Plotnik, M.; Inbar, N.; Meidan, I.; Herman, T.; Brozgol, M.; Shaviv, E.; Giladi, N.; et al. Potentials of Enhanced Context Awareness in Wearable Assistants for Parkinson’s Disease Patients with the Freezing of Gait Syndrome. In Proceedings of the 2009 International Symposium on Wearable Computers, Linz, Austria, 4–7 September 2009; pp. 123–130. [Google Scholar] [CrossRef]
  32. Bächlin, M.; Plotnik, M.; Roggen, D.; Inbar, N.; Giladi, N.; Hausdorff, J.; Tröster, G. Parkinson’s disease patients’ perspective on context aware wearable technology for auditive assistance. In Proceedings of the 3rd International Conference on Pervasive Computing Technologies for Healthcare—Pervasive Health 2009, London, UK, 1–3 April 2009; Troester, G., Connelly, K., Arnrich, B., Eds.; IEEE: Piscataway, NJ, USA, 2009. [Google Scholar] [CrossRef] [Green Version]
  33. Bächlin, M.; Plotnik, M.; Roggen, D.; Giladi, N.; Hausdorff, J.M.; Tröster, G. A wearable system to assist walking of parkinsońs disease patients benefits and challenges of context-triggered acoustic cueing. Methods Inf. Med. 2010, 49, 88–95. [Google Scholar] [CrossRef]
  34. Bächlin, M.; Roggen, D.; Plotnik, M.; Hausdorff, J.M.; Giladi, N.; Tröster, G. Online detection of freezing of gait in Parkinson’s disease patients: A performance characterization. In Proceedings of the BODYNETS 2009—4th International ICST Conference on Body Area Networks, Los Angeles, CA, USA, 1–3 April 2009; Kaiser, W., Lu, C., Eds.; ICST: Brussels, Belgium, 2009. [Google Scholar] [CrossRef] [Green Version]
  35. Delval, A.; Snijders, A.H.; Weerdesteyn, V.; Duysens, J.E.; Defebvre, L.; Giladi, N.; Bloem, B.R. Objective detection of subtle freezing of gait episodes in Parkinson’s disease. Mov. Disord. 2010, 25, 1684–1693. [Google Scholar] [CrossRef]
  36. Djuric-Jovicic, M.; Jovicic, N.S.; Milovanovic, I.; Radovanovic, S.; Kresojevic, N.; Popovic, M.B. Classification of walking patterns in Parkinson’s disease patients based on inertial sensor data. In Proceedings of the 10th Symposium on Neural Network Applications in Electrical Engineering, Belgrade, Serbia, 23–25 September 2010; pp. 3–6. [Google Scholar] [CrossRef]
  37. Popovic, M.B.; Djuric-Jovicic, M.; Radovanovic, S.; Petrovic, I.; Kostic, V. A simple method to assess freezing of gait in Parkinson’s disease patients. Braz. J. Med. Res. 2010, 43, 883–889. [Google Scholar] [CrossRef] [Green Version]
  38. Cole, B.T.; Roy, S.H.; Nawab, S.H. Detecting freezing-of-gait during unscripted and unconstrained activity. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMB, Boston, MA, USA, 30 August–3 September 2011; pp. 5649–5652. [Google Scholar] [CrossRef]
  39. Tsipouras, M.G.; Tzallas, A.T.; Tripoliti, E.; Rigas, G.; Bougia, P.; Fotiadis, D.I.; Tsouli, S.; Konitsiotis, S. On assessing motor disorders in Parkinson’s disease. In Proceedings of the Second International ICST Conference, MobiHealth 2010, Ayia Napa, Cyprus, 18–20 October 2010; Lin, J., Nikita, K.S., Eds.; Springer: Berlin/Heidelberg, Germany, 2011; pp. 35–38. [Google Scholar] [CrossRef]
  40. Niazmand, K.; Tonn, K.; Zhao, Y.; Fietzek, U.M.; Schroeteler, F.; Ziegler, K.; Ceballos-Baumann, A.O.; Lueth, T.C. Freezing of Gait detection in Parkinson’s disease using accelerometer based smart clothes. In Proceedings of the 2011 IEEE Biomedical Circuits and Systems Conference (BioCAS), San Diego, CA, USA, 10–12 November 2011; pp. 201–204. [Google Scholar] [CrossRef]
  41. Zhao, Y.; Tonn, K.; Niazmand, K.; Fietzek, U.M.; D’Angelo, L.T.; Ceballos-Baumann, A.; Lueth, T.C. Online FOG identification in Parkinson’s disease with a time-frequency combined algorithm. In Proceedings of the IEEE-EMBS International Conference on Biomedical and Health Informatics, Hong Kong and Shenzhen, China, 2–7 January 2012; pp. 192–195. [Google Scholar] [CrossRef]
  42. Mazilu, S.; Hardegger, M.; Zhu, Z.; Roggen, D.; Troester, G.; Plotnik, M.; Hausdorff, J. Online detection of freezing of gait with smartphones and machine learning techniques. In Proceedings of the 6th International Conference on Pervasive Computing Technologies for Healthcare, San Diego, CA, USA, 21–24 May 2012; Arriaga, R., Matic, A., Eds.; IEEE: Piscataway, NJ, USA, 2012. [Google Scholar] [CrossRef]
  43. Tripoliti, E.E.; Tzallas, A.T.; Tsipouras, M.G.; Rigas, G.; Bougia, P.; Leontiou, M.; Konitsiotis, S.; Chondrogiorgi, M.; Tsouli, S.; Fotiadis, D.I. Automatic detection of freezing of gait events in patients with Parkinson’s disease. Comput. Methods Programs Biomed. 2013, 110, 12–26. [Google Scholar] [CrossRef]
  44. Moore, S.T.; Yungher, D.A.; Morris, T.R.; Dilda, V.; MacDougall, H.G.; Shine, J.M.; Naismith, S.L.; Lewis, S.J. Autonomous identification of freezing of gait in Parkinson’s disease from lower-body segmental accelerometry. J. Neuroeng. Rehabil. 2013, 10, 19:1–19:10. [Google Scholar] [CrossRef] [Green Version]
  45. Mazilu, S.; Calatroni, A.; Gazit, E.; Roggen, D.; Hausdorff, J.M.; Tröster, G. Feature learning for detection and prediction of freezing of gait in Parkinson’s disease. In Machine Learning and Data Mining in Pattern Recognition; Perner, P., Ed.; Springer: Berlin/Heidelberg, Germany, 2013; Volume 7988, pp. 144–158. [Google Scholar]
  46. Coste, C.A.; Sijobert, B.; Pissard-Gibollet, R.; Pasquier, M.; Espiau, B.; Geny, C. Detection of freezing of gait in Parkinson disease: Preliminary results. Sensors 2014, 14, 6819–6827. [Google Scholar] [CrossRef] [PubMed]
  47. Sijobert, B.; Denys, J.; Coste, C.A.; Geny, C. IMU based detection of freezing of gait and festination in Parkinson’s disease. In Proceedings of the 2014 IEEE 19th International Functional Electrical Stimulation Society Annual Conference, IFESS, Kuala Lumpur, Malaysia, 17–19 September 2014. [Google Scholar] [CrossRef]
  48. Kwon, Y.; Park, S.H.; Kim, J.W.; Ho, Y.; Jeon, H.M.; Bang, M.J.; Jung, G.I.; Lee, S.M.; Eom, G.M.; Koh, S.B.; et al. A practical method for the detection of freezing of gait in patients with Parkinson’s disease. Clin. Interv. Aging 2014, 9, 1709–1719. [Google Scholar] [CrossRef] [Green Version]
  49. Pepa, L.; Ciabattoni, L.; Verdini, F.; Capecci, M.; Ceravolo, M.G.G. Smartphone based fuzzy logic freezing of gait detection in Parkinson’s Disease. In Proceedings of the 10th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications, Senigallia, Italy, 10–12 September 2014. [Google Scholar] [CrossRef]
  50. Djuric-Jovicic, M.D.; Jovicic, N.S.; Radovanovic, S.M.; Stankovic, I.D.; Popovic, M.B.; Kostic, V.S. Automatic Identification and Classification of Freezing of Gait Episodes in Parkinson’s Disease Patients. IEEE Trans. Neural Syst. Rehabil. Eng. 2014, 22, 685–694. [Google Scholar] [CrossRef] [PubMed]
  51. Assam, R.; Seidl, T. Prediction of freezing of gait from parkinson’s disease movement time series using conditional random fields. In Proceedings of the 3rd ACM SIGSPATIAL International Workshop on the Use of GIS in Public Health, HealthGIS 2014—In Conjuncture with the 22nd ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems, ACM GIS 2014, Dallas, TX, USA, 4 November 2014; Kanza, Y., Gudes, O., Goldberg, D.W., Eds.; ACM: New York, NY, USA, 2014; pp. 11–20. [Google Scholar] [CrossRef]
  52. Mazilu, S.; Blanke, U.; Troster, G. Gait, wrist, and sensors: Detecting freezing of gait in Parkinson’s disease from wrist movement. In Proceedings of the IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops), St. Louis, MI, USA, 23–27 March 2015; pp. 579–584. [Google Scholar] [CrossRef]
  53. Zach, H.; Janssen, A.M.; Snijders, A.H.; Delval, A.; Ferraye, M.U.; Auff, E.; Weerdesteyn, V.; Bloem, B.R.; Nonnekes, J. Identifying freezing of gait in Parkinson’s disease during freezing provoking tasks using waist-mounted accelerometry. Park. Relat. Disord. 2015, 21, 1362–1366. [Google Scholar] [CrossRef] [PubMed]
  54. Kim, H.; Lee, H.J.; Lee, W.; Kwon, S.; Kim, S.K.; Jeon, H.S.; Park, H.; Shin, C.W.; Yi, W.J.; Jeon, B.S.; et al. Unconstrained detection of freezing of Gait in Parkinson’s disease patients using smartphone. In Proceedings of the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 3751–3754. [Google Scholar] [CrossRef]
  55. Handojoseno, A.M.A.; Gilat, M.; Quynh Tran Ly, Q.T.; Chamtie, H.; Shine, J.M.; Nguyen, T.N.; Tran, Y.; Lewis, S.J.G.; Nguyen, H.T. An EEG study of turning freeze in Parkinson’s disease patients: The alteration of brain dynamic on the motor and visual cortex. In Proceedings of the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 6618–6621. [Google Scholar] [CrossRef]
  56. Venu, N.; C, L.; Student, P.G. Prediction of freezing of gait (FOG) episodes in Parkinson’s disease patients by gait analysis. Int. J. Innov. Res. Sci. Eng. Technol. 2016, 5, 13679–13684. [Google Scholar] [CrossRef]
  57. Rodriguez-Martín, D.; Sama, A.; Pérez-López, C.; Catala, A.; Mestre, B.; Alcaine, S.; Bayes, À. Comparison of features, window sizes and classifiers in detecting freezing of gait in patients with Parkinson’s disease through a waist-worn accelerometer. In Proceedings of the 19th International Conference of the Catalan Association for Artificial Intelligence, Barcelona, Spain, 19–21 October 2016; Nebot, À., Binefa, X., de Mántaras, R.L., Eds.; IOS Press: Amsterdam, The Netherlands, 2016; pp. 127–136. [Google Scholar] [CrossRef]
  58. Mazilu, S.; Blanke, U.; Calatroni, A.; Gazit, E.; Hausdorff, J.M.; Tröster, G. The role of wrist-mounted inertial sensors in detecting gait freeze episodes in Parkinson’s disease. Pervasive Mob. Comput. 2016, 33, 1–16. [Google Scholar] [CrossRef]
  59. Lorenzi, P.; Rao, R.; Romano, G.; Kita, A.; Irrera, F. Mobile devices for the real-time detection of specific human motion disorders. IEEE Sens. J. 2016, 16, 8220–8227. [Google Scholar] [CrossRef]
  60. Lorenzi, P.; Rao, R.; Romano, G.; Kita, A.; Serpa, M.; Filesi, F.; Parisi, R.; Suppa, A.; Bologna, M.; Berardelli, A.; et al. Smart sensing systems for the detection of human motion disorders. Procedia Eng. 2015, 120, 324–327. [Google Scholar] [CrossRef] [Green Version]
  61. Lorenzi, P.; Rao, R.; Romano, G.; Kita, A.; Serpa, M.; Filesi, F.; Irrera, F.; Bologna, M.; Suppa, A.; Berardelli, A. Smart sensors for the recognition of specific human motion disorders in Parkinson’s disease. In Proceedings of the 2015 6th International Workshop on Advances in Sensors and Interfaces (IWASI), Gallipoli, Italy, 18–19 June 2015; pp. 131–136. [Google Scholar] [CrossRef]
  62. Kita, A.; Lorenzi, P.; Romano, G.; Rao, R.; Parisi, R.; Suppa, A.; Bologna, M.; Berardelli, A.; Irrera, F. Smart Sensing System for the Detection of Specific Human Motion Symptoms of the Parkinson’s Disease. In Proceedings of the 9th International Joint Conference on Biomedical Engineering Systems and Technologies, BIODEVICES, Rome, Italy, 21–23 February 2016; SCITEPRESS—Science and Technology Publications: Setúbal, Portugal, 2016; pp. 152–159. [Google Scholar] [CrossRef]
  63. Rezvanian, S.; Lockhart, T.E. Towards real-time detection of freezing of gait using wavelet transform on wireless accelerometer data. Sensors 2016, 16, 475. [Google Scholar] [CrossRef]
  64. Ahlrichs, C.; Samà, A.; Lawo, M.; Cabestany, J.; Rodríguez-Martín, D.; Pérez-López, C.; Sweeney, D.; Quinlan, L.R.; Laighin, G.Ò.; Counihan, T.; et al. Detecting freezing of gait with a tri-axial accelerometer in Parkinson’s disease patients. Med. Biol. Eng. Comput. 2016, 54, 223–233. [Google Scholar] [CrossRef] [Green Version]
  65. Capecci, M.; Pepa, L.; Verdini, F.; Ceravolo, M.G. A smartphone-based architecture to detect and quantify freezing of gait in Parkinson’s disease. Gait Posture 2016, 50, 28–33. [Google Scholar] [CrossRef] [PubMed]
  66. Ly, Q.T.; Handojoseno, A.M.A.; Gilat, M.; Nguyen, N.; Chai, R.; Tran, Y.; Lewis, S.J.G.; Nguyen, H.T. Identifying montages that best detect the electroencephalogram power spectrum alteration during freezing of gait in Parkinson’s disease patients. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS, Orlando, FL, USA, 16–20 August 2016; pp. 6094–6097. [Google Scholar] [CrossRef]
  67. Pham, T.T.; Moore, S.T.; Lewis, S.J.G.; Nguyen, D.N.; Dutkiewicz, E.; Fuglevand, A.J.; McEwan, A.L.; Leong, P.H.W. Freezing of gait detection in Parkinson’s disease: A subject-independent detector using anomaly scores. IEEE Trans. Biomed. Eng. 2017, 64, 2719–2728. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  68. Shine, J.M.; Moore, S.T.; Bolitho, S.J.; Morris, T.R.; Dilda, V.; Naismith, S.L.; Lewis, S.J.G. Assessing the utility of Freezing of gait questionnaires in Parkinson’s disease. Park. Relat. Disord. 2012, 18, 25–29. [Google Scholar] [CrossRef] [PubMed]
  69. Pham, T.T.; Nguyen, D.N.; Dutkiewicz, E.; McEwan, A.L.; Leong, P.H.W. Wearable healthcare systems: A single channel accelerometer based anomaly detector for studies of gait freezing in Parkinson’s disease. In Proceedings of the 2017 IEEE International Conference on Communications (ICC), Paris, France, 21–25 May 2017; pp. 1–5. [Google Scholar] [CrossRef]
  70. Ahn, D.; Chung, H.; Lee, H.-W.; Kang, K.; Ko, P.-W.; Kim, N.S.; Park, T. Smart gait-aid glasses for Parkinson’s disease patients. IEEE Trans. Biomed. Eng. 2017, 64, 2394–2402. [Google Scholar] [CrossRef]
  71. Tahafchi, P.; Molina, R.; Roper, J.A.; Sowalsky, K.; Hass, C.J.; Gunduz, A.; Okun, M.S.; Judy, J.W. Freezing-of-Gait detection using temporal, spatial, and physiological features with a support-vector-machine classifier. In Proceedings of the 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Seogwipo, Korea, 11–15 July 2017; pp. 2867–2870. [Google Scholar] [CrossRef]
  72. Suppa, A.; Kita, A.; Leodori, G.; Zampogna, A.; Nicolini, E.; Lorenzi, P.; Rao, R.; Irrera, F. L-DOPA and freezing of gait in Parkinson’s disease: Objective assessment through a wearable wireless system. Front. Neurol. 2017, 8, 406:1–406:14. [Google Scholar] [CrossRef]
  73. Kita, A.; Lorenzi, P.; Rao, R.; Irrera, F. Reliable and robust detection of freezing of gait episodes with wearable electronic devices. IEEE Sens. J. 2017, 17, 1899–1908. [Google Scholar] [CrossRef] [Green Version]
  74. Rodríguez-Martín, D.; Samà, A.; Pérez-López, C.; Català, A.; Moreno Arostegui, J.M.; Cabestany, J.; Bayés, À.; Alcaine, S.; Mestre, B.; Prats, A.; et al. Home detection of freezing of gait using support vector machines through a single waist-worn triaxial accelerometer. PLoS ONE 2017, 12, e0171764:1–e0171764:26. [Google Scholar] [CrossRef]
  75. Rodríguez-Martín, D.; Pérez-López, C.; Samà, A.; Català, A.; Arostegui, J.M.M.; Cabestany, J.; Mestre, B.; Alcaine, S.; Prats, A.; De La Cruz Crespo, M.; et al. A waist-worn inertial measurement unit for long-term monitoring of Parkinson’s disease patients. Sensors 2017, 17, 827. [Google Scholar] [CrossRef] [Green Version]
  76. Ly, Q.T.; Handojoseno, A.M.A.; Gilat, M.; Chai, R.; Martens, K.A.E.; Georgiades, M.; Naik, G.R.; Tran, Y.; Lewis, S.J.G.; Nguyen, H.T. Detection of turning freeze in Parkinson’s disease based on S-transform decomposition of EEG signals. In Proceedings of the 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Seogwipo, Korea, 11–15 July 2017; pp. 3044–3047. [Google Scholar] [CrossRef] [Green Version]
  77. Pepa, L.; Capecci, M.; Ciabattoni, L.; Spalazzi, L.; Ceravolo, M.G. An unobtrusive expert system to detect freezing of gait during daily living in people with Parkinson’s disease. In Proceedings of the 2017 2nd International Multidisciplinary Conference on Computer and Energy Science, Split, Croatia, 12–14 July 2017; Rodrigues, J.J.P.C., Nižetić, S., Patrono, L., Milanović, Z., Šolić, P., Vukojević, K., Perković, T., Eds.; IEEE: Piscataway, NJ, USA, 2017. [Google Scholar]
  78. Wang, K.; Li, B.; Gu, D.; Dai, K.; Zhou, L. A smartphone based system for freezing of gait monitoring for Parkinson’s disease patients. In Proceedings of the 9th IEEE International Conference on Communication Software and Networks, ICCSN 2017, Guangzhou, China, 6–8 May 2017; pp. 1529–1533. [Google Scholar] [CrossRef]
  79. Punin, C.; Barzallo, B.; Huerta, M.; Bermeo, A.; Bravo, M.; Llumiguano, C. Wireless devices to restart walking during an episode of FOG on patients with Parkinson’s disease. In Proceedings of the 2017 IEEE Second Ecuador Technical Chapters Meeting (ETCM), Salinas, Ecuador, 16–20 October 2017; pp. 1–6. [Google Scholar] [CrossRef]
  80. Saad, A.; Zaarour, I.; Guerin, F.; Bejjani, P.; Ayache, M.; Lefebvre, D. Detection of freezing of gait for Parkinson’s disease patients with multi-sensor device and Gaussian neural networks. Int. J. Mach. Learn. Cybern. 2017, 8, 941–954. [Google Scholar] [CrossRef]
  81. Samà, A.; Rodríguez-Martín, D.; Pérez-López, C.; Català, A.; Alcaine, S.; Mestre, B.; Prats, A.; Crespo, M.C.; Bayés, À. Determining the optimal features in freezing of gait detection through a single waist accelerometer in home environments. Pattern Recognit. Lett. 2018, 105, 135–143. [Google Scholar] [CrossRef]
  82. Prateek, G.V.; Skog, I.; McNeely, M.E.; Duncan, R.P.; Earhart, G.M.; Nehorai, A. Modeling, detecting, and tracking freezing of gait in Parkinson disease using inertial sensors. IEEE Trans. Biomed. Eng. 2018, 65, 2152–2161. [Google Scholar] [CrossRef] [PubMed]
83. Ashour, A.S.; El-Attar, A.; Dey, N.; El-Naby, M.M.A.; El-Kader, H.A. Patient-dependent freezing of gait detection using signals from multi-accelerometer sensors in Parkinson’s disease. In Proceedings of the 9th Cairo International Biomedical Engineering Conference, CIBEC 2018, Cairo, Egypt, 20–22 December 2018; Yassine, I., Basha, T., Eldeib, A., Eds.; IEEE: Piscataway, NJ, USA, 2019; pp. 171–174. [Google Scholar] [CrossRef]
  84. Roffo, G.; Melzi, S.; Cristani, M. Infinite Feature Selection. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Santiago, Chile, 7–13 December 2015; pp. 4202–4210. [Google Scholar] [CrossRef]
  85. Camps, J.; Samà, A.; Martín, M.; Rodríguez-Martín, D.; Pérez-López, C.; Moreno Arostegui, J.M.; Cabestany, J.; Català, A.; Alcaine, S.; Mestre, B.; et al. Deep learning for freezing of gait detection in Parkinson’s disease patients in their homes using a waist-worn inertial measurement unit. Knowl.-Based Syst. 2018, 139, 119–131. [Google Scholar] [CrossRef]
  86. Oung, Q.W.; Basah, S.N.; Muthusamy, H.; Vijean, V.; Lee, H.; Khairunizam, W.; Bakar, S.A.; Razlan, Z.M.; Ibrahim, Z. Objective evaluation of freezing of gait in patients with Parkinson’s disease through machine learning approaches. In Proceedings of the 2018 International Conference on Computational Approach in Smart Systems Design and Applications, ICASSDA, Kuching, Malaysia, 15–17 August 2018. [Google Scholar] [CrossRef]
  87. Li, B.; Zhang, Y.; Tang, L.; Gao, C.; Gu, D. Automatic detection system for freezing of gait in Parkinson’s disease based on the clustering algorithm. In Proceedings of the 2nd IEEE Advanced Information Management, Communicates, Electronic and Automation Control Conference, IMCEC, Xi’an, China, 25–27 May 2018; Xu, B., Ed.; IEEE: Piscataway, NJ, USA, 2018; pp. 1640–1644. [Google Scholar] [CrossRef]
  88. Mikos, V.; Heng, C.-H.; Tay, A.; Chia, N.S.Y.; Koh, K.M.L.; Tan, D.M.L.; Au, W.L. Real-time patient adaptivity for freezing of gait classification through semi-supervised neural networks. In Proceedings of the 16th IEEE International Conference on Machine Learning and Applications, ICMLA 2017, Cancun, Mexico, 18–21 December 2017; Chen, X., Luo, B., Luo, F., Palade, V., Wani, M.A., Eds.; IEEE: Piscataway, NJ, USA, 2018; Volume 1, pp. 871–876. [Google Scholar] [CrossRef]
  89. Mikos, V.; Heng, C.-H.; Tay, A.; Yen, S.-C.; Chia, N.S.Y.; Koh, K.M.L.; Tan, D.M.L.; Au, W.L. A neural network accelerator with integrated feature extraction processor for a freezing of gait detection system. In Proceedings of the IEEE Asian Solid-State Circuits Conference, Tainan, Taiwan, 5–7 November 2018; pp. 59–62. [Google Scholar] [CrossRef]
90. Mohammadian Rad, N.; van Laarhoven, T.; Furlanello, C.; Marchiori, E. Novelty detection using deep normative modeling for IMU-based abnormal movement monitoring in Parkinson’s disease and autism spectrum disorders. Sensors 2018, 18, 3533. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  91. El-Attar, A.; Ashour, A.S.; Dey, N.; El-Kader, H.A.; Abd El-Naby, M.M.; Shi, F. Hybrid DWT-FFT features for detecting freezing of gait in Parkinson’s disease. Front. Artif. Intell. Appl. 2019, 314, 117–126. [Google Scholar] [CrossRef]
92. Punin, C.; Barzallo, B.; Clotet, R.; Bermeo, A.; Bravo, M.; Bermeo, J.P.; Llumiguano, C. A non-invasive medical device for Parkinson’s patients with episodes of freezing of gait. Sensors 2019, 19, 737. [Google Scholar] [CrossRef] [Green Version]
  93. Barzallo, B.; Punin, C.; Llumiguano, C.; Huerta, M. Wireless assistance system during episodes of freezing of gait by means superficial electrical stimulation. In Proceedings of the World Congress on Medical Physics and Biomedical Engineering 2018, IFMBE, Prague, Czech Republic, 3–8 June 2018; Lhotska, L., Ed.; Springer: Singapore, 2019; Volume 68, pp. 865–870. [Google Scholar] [CrossRef]
  94. Mazzetta, I.; Zampogna, A.; Suppa, A.; Gumiero, A.; Pessione, M.; Irrera, F. Wearable sensors system for an improved analysis of freezing of gait in Parkinson’s disease using electromyography and inertial signals. Sensors 2019, 19, 948. [Google Scholar] [CrossRef] [Green Version]
95. Mazilu, S.; Calatroni, A.; Gazit, E.; Mirelman, A.; Hausdorff, J.M.; Tröster, G. Prediction of freezing of gait in Parkinson’s from physiological wearables: An exploratory study. IEEE J. Biomed. Health Inform. 2015, 19, 1843–1854. [Google Scholar] [CrossRef]
  96. Ardi Handojoseno, A.M.; Shine, J.M.; Nguyen, T.N.; Tran, Y.; Lewis, S.J.G.; Nguyen, H.T. Analysis and prediction of the freezing of gait using EEG brain dynamics. IEEE Trans. Neural Syst. Rehabil. Eng. 2015, 23, 887–896. [Google Scholar] [CrossRef]
  97. Zia, J.; Tadayon, A.; McDaniel, T.; Panchanathan, S. Utilizing neural networks to predict freezing of gait in Parkinson’s patients. In Proceedings of the 18th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS 2016, Reno, NV, USA, 23–26 October 2016; ACM: New York, NY, USA, 2016; pp. 333–334. [Google Scholar] [CrossRef]
98. Palmerini, L.; Rocchi, L.; Mazilu, S.; Gazit, E.; Hausdorff, J.M.; Chiari, L. Identification of characteristic motor patterns preceding freezing of gait in Parkinson’s disease using wearable sensors. Front. Neurol. 2017, 8, 394:1–394:12. [Google Scholar] [CrossRef] [Green Version]
  99. Handojoseno, A.M.A.; Naik, G.R.; Gilat, M.; Shine, J.M.; Nguyen, T.N.; Quynh, T.L.Y.; Lewis, S.J.G.; Nguyen, H.T. Prediction of freezing of gait in patients with Parkinson’s disease using EEG signals. In Telehealth for our Ageing Society; van den Berg, M.E.L., Maeder, A.L., Eds.; IOS Press: Amsterdam, The Netherlands, 2018; Volume 246, pp. 124–131. ISBN 9781614998440. [Google Scholar]
  100. Torvi, V.G.; Bhattacharya, A.; Chakraborty, S. Deep domain adaptation to predict freezing of gait in patients with Parkinson’s disease. In Proceedings of the 17th IEEE International Conference on Machine Learning and Applications, ICMLA 2018, Orlando, FL, USA, 17–18 December 2018; Wani, M.A., Sayed-Mouchaweh, M., Kantardzic, M., Gama, J., Lughofer, E., Eds.; IEEE: Piscataway, NJ, USA, 2019; pp. 1001–1006. [Google Scholar] [CrossRef]
  101. Mazilu, S.; Blanke, U.; Roggen, D.; Tröster, G.; Gazit, E.; Hausdorff, J.M. Engineers meet clinicians. In Proceedings of the 4th Augmented Human International Conference, Stuttgart, Germany, 7–8 March 2013; ACM: New York, NY, USA, 2013; pp. 124–127. [Google Scholar] [CrossRef]
  102. REMPARK—Personal Health Device for the Remote and Autonomous Management of Parkinson’s Disease. Available online: https://rempark.cetpd.upc.edu/project (accessed on 2 October 2019).
  103. Verleysen, M.; Pérez López, C.; Rodríguez Martín, D.M.; Cabestany Moncusí, J.; Moreno Aróstegui, J.M.; Rodríguez Molinero, A. A heterogeneous database for movement knowledge extraction in Parkinson’s Disease. In Proceedings of the European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning: ESANN 2013, Bruges, Belgium, 24–26 April 2013; pp. 413–418. [Google Scholar]
  104. MASPARK. Available online: https://www.aicos.fraunhofer.pt/en/our_work/projects/maspark.html (accessed on 10 May 2019).
  105. Niazmand, K.; Tonn, K.; Kalaras, A.; Fietzek, U.M.; Mehrkens, J.H.; Lueth, T.C. Quantitative evaluation of Parkinson’s disease using sensor based smart glove. In Proceedings of the 24th International Symposium on Computer-Based Medical Systems (CBMS), Bristol, UK, 27–30 June 2011; IEEE: Piscataway, NJ, USA, 2011; pp. 1–8. [Google Scholar] [CrossRef]
  106. Nilsson, J.-O.; Gupta, A.K.; Handel, P. Foot-mounted inertial navigation made easy. In Proceedings of the 2014 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Busan, Korea, 27–30 October 2014; IEEE: Piscataway, NJ, USA, 2015; pp. 24–29. [Google Scholar] [CrossRef]
  107. Zijlstra, W.; Hof, A.L. Assessment of spatio-temporal gait parameters from trunk accelerations during human walking. Gait Posture 2003, 18, 1–10. [Google Scholar] [CrossRef] [Green Version]
  108. Laudanski, A.; Yang, S.; Li, Q. A concurrent comparison of inertia sensor-based walking speed estimation methods. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, USA, 30 August–3 September 2011; pp. 3484–3487. [Google Scholar] [CrossRef]
  109. Madgwick, S.O.H.; Harrison, A.J.L.; Vaidyanathan, R. Estimation of IMU and MARG orientation using a gradient descent algorithm. In Proceedings of the IEEE International Conference on Rehabilitation Robotics, Zurich, Switzerland, 29 June–1 July 2011; pp. 1–7. [Google Scholar] [CrossRef]
110. Cole, B.T.; Roy, S.H.; De Luca, C.J.; Nawab, S.H. Dynamic neural network detection of tremor and dyskinesia from wearable sensor data. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Buenos Aires, Argentina, 31 August–4 September 2010; pp. 6062–6065. [Google Scholar] [CrossRef]
  111. Moore, S.T.; MacDougall, H.G.; Gracies, J.M.; Cohen, H.S.; Ondo, W.G. Long-term monitoring of gait in Parkinson’s disease. Gait Posture 2007, 26, 200–207. [Google Scholar] [CrossRef] [PubMed]
  112. Khan, A.M.; Lee, Y.K.; Kim, T.-S. Accelerometer signal-based human activity recognition using augmented autoregressive model coefficients and artificial neural nets. In Proceedings of the 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver, BC, Canada, 20–25 August 2008; pp. 5172–5175. [Google Scholar] [CrossRef]
113. Kaminski, M.J.; Blinowska, K.J. A new method of the description of the information flow in the brain structures. Biol. Cybern. 1991, 65, 203–210. [Google Scholar] [CrossRef] [PubMed]
  114. Koopman, B.O. Hamiltonian Systems and Transformation in Hilbert Space. Proc. Natl. Acad. Sci. USA 1931, 17, 315–318. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  115. Hua, J.C.; Roy, S.; McCauley, J.L.; Gunaratne, G.H. Using dynamic mode decomposition to extract cyclic behavior in the stock market. Phys. A Stat. Mech. Appl. 2016, 448, 172–180. [Google Scholar] [CrossRef]
  116. Phinyomark, A.; Limsakul, C.; Phukpattaranont, P. A Novel Feature Extraction for Robust EMG Pattern Recognition. J. Comput. 2009, 1, 71–80. [Google Scholar]
  117. Phinyomark, A.; Nuidod, A.; Phukpattaranont, P.; Limsakul, C. Feature extraction and reduction of wavelet transform coefficients for EMG pattern classification. Electron. Electr. Eng. 2012, 122, 27–32. [Google Scholar] [CrossRef] [Green Version]
  118. Challis, R.E.; Kitney, R.I. Biomedical signal processing (in four parts) Part 3 The power spectrum and coherence function. Med. Biol. Eng. 1991, 29, 225–241. [Google Scholar] [CrossRef]
  119. Tzallas, A.T.; Tsipouras, M.G.; Rigas, G.; Tsalikakis, D.G.; Karvounis, E.C.; Chondrogiorgi, M.; Psomadellis, F.; Cancela, J.; Pastorino, M.; Waldmeyer, M.T.A.; et al. PERFORM: A system for monitoring, assessment and management of patients with Parkinson’s disease. Sensors 2014, 14, 21329–21357. [Google Scholar] [CrossRef] [Green Version]
  120. Shiavi, R. Introduction to Applied Statistical Signal Analysis: Guide to Biomedical and Electrical Engineering Applications, 3rd ed.; Academic Press: Burlington, NJ, USA, 2007. [Google Scholar]
  121. Vinck, M.; Oostenveld, R.; Van Wingerden, M.; Battaglia, F.; Pennartz, C.M.A. An improved index of phase-synchronization for electrophysiological data in the presence of volume-conduction, noise and sample-size bias. Neuroimage 2011, 55, 1548–1565. [Google Scholar] [CrossRef]
  122. Lachaux, J.-P.; Rodriguez, E.; Martinerie, J.; Varela, F.J. Measuring phase synchrony in brain signals. Hum. Brain Mapp. 1999, 8, 194–208. [Google Scholar] [CrossRef] [Green Version]
  123. Tay, A.; Yen, S.C.; Lee, P.Y.; Wang, C.Y.; Neo, A.; Phan, S.W.; Yogaprakash, K.; Liew, S.; Au, W.L. Freezing of gait (FoG) detection for Parkinson disease. In Proceedings of the 10th Asian Control Conference (ASCC), Kota Kinabalu, Malaysia, 31 May–3 June 2015; pp. 1–6. [Google Scholar] [CrossRef]
124. Wu, X.; Kumar, V.; Quinlan, J.R.; Ghosh, J.; Yang, Q.; Motoda, H.; McLachlan, G.J.; Ng, A.; Liu, B.; Yu, P.S.; et al. Top 10 algorithms in data mining. Knowl. Inf. Syst. 2008, 14, 1–37. [Google Scholar] [CrossRef] [Green Version]
  125. Bishop, C.M. Pattern Recognition and Machine Learning, 1st ed.; Springer-Verlag: New York, NY, USA, 2006; ISBN 9780387310732. [Google Scholar]
  126. Friedman, J.; Hastie, T.; Tibshirani, R. Additive logistic regression: A statistical view of boosting. Ann. Stat. 2000, 28, 337–407. [Google Scholar] [CrossRef]
  127. Seiffert, C.; Khoshgoftaar, T.M.; Van Hulse, J.; Napolitano, A. RUSBoost: Improving classification performance when training data is skewed. In Proceedings of the 19th International Conference on Pattern Recognition, Tampa Bay, FL, USA, 8–11 December 2008; pp. 1–4. [Google Scholar] [CrossRef]
  128. Freund, Y. A More Robust Boosting Algorithm. arXiv 2009, arXiv:0905.2138. [Google Scholar]
129. Egmont-Petersen, M.; de Ridder, D.; Handels, H. Image processing with neural networks—A review. Pattern Recognit. 2002, 35, 2279–2301. [Google Scholar] [CrossRef]
  130. Yamashita, R.; Nishio, M.; Do, R.K.G.; Togashi, K. Convolutional neural networks: An overview and application in radiology. Insights Imaging 2018, 9, 611–629. [Google Scholar] [CrossRef] [Green Version]
  131. Anwar, S.M.; Majid, M.; Qayyum, A.; Awais, M.; Alnowami, M.; Khan, M.K. Medical Image Analysis using Convolutional Neural Networks: A Review. J. Med. Syst. 2018, 42, 1–13. [Google Scholar] [CrossRef] [Green Version]
  132. Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
  133. Sak, H.; Senior, A.; Beaufays, F. Long Short-Term Memory Based Recurrent Neural Network Architectures for Large Vocabulary Speech Recognition. arXiv 2014, arXiv:1402.1128. [Google Scholar]
  134. Heremans, E.; Nieuwboer, A.; Vercruysse, S. Freezing of gait in Parkinson’s disease: Where are we now? Curr. Neurol. Neurosci. Rep. 2013, 13, 350:1–350:9. [Google Scholar] [CrossRef]
  135. Howcroft, J.; Kofman, J.; Lemaire, E.D. Prospective fall-risk prediction models for older adults based on wearable sensors. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 25, 1812–1820. [Google Scholar] [CrossRef]
  136. Howcroft, J.; Lemaire, E.D.; Kofman, J. Wearable-sensor-based classification models of faller status in older adults. PLoS ONE 2016, 11, e0153240:1–e0153240:16. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  137. Caby, B.; Kieffer, S.; de Saint Hubert, M.; Cremer, G.; Macq, B. Feature extraction and selection for objective gait analysis and fall risk assessment by accelerometry. Biomed. Eng. Online 2011, 10, 1–19. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  138. Shany, T.; Redmond, S.J.; Narayanan, M.R.; Lovell, N.H. Sensors-based wearable systems for monitoring of human movement and falls. IEEE Sens. J. 2012, 12, 658–670. [Google Scholar] [CrossRef]
  139. Saeys, Y.; Inza, I.; Larranaga, P. A review of feature selection techniques in bioinformatics. Bioinformatics 2007, 23, 2507–2517. [Google Scholar] [CrossRef] [PubMed] [Green Version]
Figure 1. Diagram of article selection process.
Table 1. Characteristics of FOG detection studies and FOG prediction studies, using wearable sensors.
Source | Studied Population | Walking Task Performed | Sensor Type and Location | FOG Detection Method | Features | Classifier Performance | Real Time
Moore 2008 [29]11 FOG-PD (7 froze), ON and OFF, 46 episodesLab, straight walking, 180° turns, narrow doorways, obstacle avoidance.IMU (1) left shankFreeze index (FI) with person- specific thresholds. 6 s windows, detection based on FOG episode occurrences.EDetected 89.1% of episode occurrences, 10% false positivesNo
Zabaleta 2008 [30]4 FOG-PD, ON and OFFLab, sit to stand, 90° and 180° turns, figure-eight, doorway navigation, obstacle avoidance.IMU (6) heels, shanks, thighsMultivariate linear discriminant analysis, frequency-based features. Person-specific, detection based on classification of individual 3 s windows.EArea under ROC curve. Average of all participants: 0.937No
Jovanov 2009 [23]4 HC, 1 UFOG-PDLab, sit to stand and walking.IMU (1) right kneeFI [29], 0.32 s windows (64 samples at 200 Hz).E-Yes
Bachlin 2009–2010 [24,31,32,33] *10 FOG-PD (8 froze), 237 episodesLab, straight walking, 180° turns, random instructions to start, stop, and turn 360° in both directions. Simulated ADL (walk to room, return with glass of water)Acc (3) left shank, left thigh, lower backFI [29] with additional energy threshold to reduce false positives due to standing. 4 s windows with 0.5 s shift each step. Detection performance based on classification of windows with a 2 s tolerance.EPerson-independent threshold:
Sensitivity: 73.1%
Specificity: 81.6%
Yes
Bachlin 2009 [34] *10 FOG-PD (8 froze) 237 episodesLab, straight walking, 180° turns, randomly given instructions and simulated ADL.Acc (3) left shank, left thigh, lower backSame methods as [24]. Improved offline through person-specific thresholds. Detection performance based on classification of windows with a 2 s tolerance.ESensitivity: 88.6%
Specificity: 92.8%
No
Delval 2010 [35]10 HC, 10 NFOG-PD, 10 FOG-PD (5 froze), OFF, 20 episodesLab, 2 km/h treadmill, objects unexpectedly dropped on belt in front of participant.CBMC, goniometers (2) kneesCompared stride features (e.g., step duration, step distance), and FI to person-independent thresholds, using 4.1 s windows.ESensitivity: 75–83%
Specificity: >95%
No
Djuric-Jovicic 2010 [36]4 FOG-PDLab, sit to stand, straight walking through doorway, 180° turn, return to seat.IMU (6) feet, shanks, thighsEnergy thresholds to detect movement, combined with NN for FOG detection. 0.2 s and 1.0 s windows. Classification performance based on number and duration of false detections.EClassification error up to 16%No
Popovic 2010 [37]9 FOG-PD (7 froze), ON, 24 episodesLab, sit to stand, straight walking through doorway, 180° turn, return to seat.FSR in-shoe insole, Acc (6) feet, shanks, thighsFSR signals to create single person-specific “normal step”. Pearson’s correlation coefficient (PCC) calculated for FSR signal of entire trial, then compared to a threshold.E-No
Cole 2011 [38]2 HC, 10 UFOG-PD, 107 episodesLab, unscripted ADL in mock apartment.Acc (3) shin, thigh, forearm, EMG (1) shinStand vs sit detection, NN for FOG detection. Person-independent model, 2 s windows, detection performance calculated per 1 s segments.ESensitivity: 82.9%
Specificity: 97.3%
No
Tsipouras 2011 [39]5 HC, 6 NFOG-PD, 5 FOG-PD-Acc (6) wrists, legs, chest, waist, Gyro (2) chest, waistC4.5 decision tree, random forest, using 2 s windows.EAccuracy:
Decision tree: 95.08%
Random forest 96.11%
No
Niazmand 2011 [40]6 FOG-PD (varying severity)Lab, walk with 180° turns, with and without walking aid. Walking, 180° and 360° turns (both directions), doorways.Instrumented pants, Acc (5) waist, thighs, shanksMulti-stage, person-independent, threshold-based classification, identifies suspicious movement, then frequency feature for classification, using 2 s windows.ESensitivity: 88.3%
Specificity: 85.3%
No
Zhao 2012 [41]8 FOG-PD (6 froze), 82 episodesLab, 5-8 min random instructions (stand, walk, stop, turn).Instrumented pants, Acc (5) waist, thighs, shanks (as in [40])Time series, acceleration peaks detection (1.5 s windows) and frequency features via FFT (4 s windows), compared to person-independent thresholds.ESensitivity: 81.7%No
Mazilu 2012 [42] *10 FOG-PD (8 froze), 237 episodesLab, straight walking, 180° turns, randomly given instructions and simulated ADL.
Acc (3) left shank, left thigh, lower backAdaBoosted decision tree classifier best among several. Compared window sizes 1–4 s, 1 s was ideal. Detection performance based on classification of individual windows.EPerson-specific:
Sensitivity: 98.35% Specificity: 99.72%
Person-independent:
Sensitivity: 66.25%
Specificity: 95.38%
No
Tripoliti 2013 [43]5 HC, 6 NFOG-PD, 5 FOG-PD, ON and OFF, 93 episodesLab, rise from bed, walking tasks including doorways, 180° turns, and ADL.Acc (4) ankles, wrists, IMU (2) waist, chestRandom forest classifier, 1 s windows. Person-independent detection performance based on classification of individual windows.ESensitivity: 81.94%
Specificity: 98.74%
No
Moore 2013 [44]25 FOG-PD (20 froze), OFF, 298 episodesLab, TUG.IMU (7) Lower back, thighs, shanks, feetFI thresholds [29]. Compared different sensor locations, person-independent thresholds and window lengths. Detection performance based on classification of FOG episode occurrences and percentage of time frozen.ELower back sensor,
10 s window:
Sensitivity: 86.2%
Specificity: 82.4%
No
Mazilu 2013 [45] *10 FOG-PD (8 froze), 237 episodesLab, straight walking, 180° turns, random instructions and simulated ADL.Acc (3) left shank, left thigh, lower backPerson-specific decision tree, tested different feature sets and supervised vs unsupervised feature selection using principal component analysis (PCA). Detection performance based on classification of individual 1 s windows.E, SUnsupervised:
Sensitivity: 77.7%
Specificity: 87.56%
Supervised:
Sensitivity: 69.42%
Specificity: 87.76%
No
Coste 2014 [46]4 UFOG-PD, 44 episodesLab, corridor walk with dual task.IMU (1) shankFreezing of gait criterion (FOGC) feature, based on cadence and stride length, incorporating person-specific thresholds. Detection performance based on classification of FOG episode occurrences.ESensitivity: 79.5%No
Sijobert 2014 [47]7 UFOG-PD, 50 episodesLab, corridor walk with dual task.IMU (1) shankFOGC [46], with person-specific thresholds. Detection performance based on classifying FOG episode occurrences. FOG episodes labeled as Green (n = 19, slight gait modification with no fall risk), Orange (n = 12, gait modification with fall risk) or red (n = 19, FOG – blocked gait).ECorrectly identified 26 of 31 FOG (orange and red)No
Kwon 2014 [48]20 FOG-PD (6 froze), ON, 36 episodesLab, repeated straight walk with 180° turns.Acc (1) in shoe heelRoot mean square (RMS) of acceleration compared to person-specific threshold. 0.2–10 s windows. 3–4 s windows recommended.EMinimum of sensitivity or specificity: 85.8%No
Pepa 2014 [49]18 UFOG-PD, ONLab, 3 TUG variations: standard, with cognitive dual task, with manual dual task.Acc (1) smartphone worn on belt at hipFuzzy logic model using frequency features, person-specific thresholds, 2.56 s windows. Detection performance based on classification of windows (sensitivity, specificity) and FOG episode occurrences (sensitivity) – distinction not indicated in results.ESensitivity: 89%
Specificity: 97%
No
Djuric-Jovicic 2014 [50]12 FOG-PD, OFFLab, sit to stand, walk with 90° and 180° turns, multiple doorways.IMU (2) shanks, FSR in-shoe insolesEach stride is compared to a “normal” stride using spectral power, stride duration, and shank displacement. Custom rule-based method classified each stride based on person-specific thresholds.EFOG with tremor:
Sensitivity: 99%
Specificity: 100%
FOG complete stop:
Sensitivity: 100% Specificity: 100%
No
Assam 2014 [51] *10 FOG-PD (8 froze), 237 episodesLab, straight walking, 180° turns, random instructions and simulated ADL.Acc (3) left shank, left thigh, lower backWavelet decomposition for feature extraction and conditional random fields for classification. Train/test for each person individually (person-specific model), compared 2.5, 4 and 8 s windows. Results for 3 participants, separately.E, SBest single participant results, with 4s window:
Sensitivity: 65%
Precision: 61.9%
No
Mazilu 2014 [25]5 FOG-PD, 102 episodesLab, walking with turns and doorways.IMU (2) anklesPerson-independent decision tree classifier (C4.5), multiple frequency-based input features, 2 s windows. Detection performance based on classifying FOG episode occurrences.E99 of 102 FOG detectedYes
Mazilu 2015 [52] **18 FOG-PD (11 froze), 182 episodesLab, walking tasks with cognitive and manual tasks.
Straight walking, 180° and 360° turns, narrow spaces, hospital circuit with elevator, unexpected stops, starts, and turns.
IMU (2) wristsDecision tree classifier (C4.5), features from wrist data, 3 s windows, person-specific detection performance based on classifying FOG episode occurrences.EPerson-specific:
Sensitivity: 90%
Specificity: 83%
No
Zach 2015 [53]23 FOG-PD (16 froze), OFF, 166 episodesLab, self-paced, fast walking, short steps, short fast steps, 360° turns both directions.Acc (1) lower backFI [29] compared to person-specific and person-independent thresholds, 2 s windows, detection performance based on classifying FOG episode occurrences.EPerson-independent threshold:
Sensitivity: 75%
Specificity: 76%
No
Kim 2015 [54]15 FOG-PD (9 froze), 46 episodesLab, hospital hallway, straight walk with 180° turns, also with dual tasks.IMU (1) (smartphone) ankle, pants pocket, chest pocket, waistAdaboosted, person-independent, decision tree using 4 s windows. Compared different sensor locations, found waist best.ESmartphone on waist:
Sensitivity: 86%
Specificity: 91.7%
No
Handojoseno 2015 [55]4 FOG-PD, OFFLab, TUG with 180° or 540° turns in both directions.EEG, headPerson-independent NN to detect FOG during turning, 0.256 s windows, 1 s samples (117 normal turning, 224 FOG turning).E, SSensitivity: 74.6%
Specificity: 48.4%
No
Venu 2016 [56] *10 FOG-PD (8 froze), 237 episodesLab, straight walking, 180° turns, random instructions and simulated ADL.Acc (3) left shank, left thigh, lower backWavelet decomposition used sub-band energies as features, continuous random field used for detection. 4 s windows.
Person-independent detection performance based on classifying FOG episode occurrences.
E, SAverage of 3 participants test set:
Sensitivity: 90.3%
Precision: 95.8%
No
Martin 2016 [57] ****6 FOG-PD, ON and OFFParticipant’s home, 180° turns, doorways, walking outside, dual tasking and false positive test intended to create shaking resembling FOG (e.g., brushing teeth).Acc (1) left hipDifferent methods, feature sets, and window sizes compared. Best results from SVM. Detection performance based on classification of individual 1.6 s windows.ESensitivity: 91.7%
Specificity: 87.4%
No
Mazilu 2016 [58] **18 FOG-PD (11 froze), 184 episodesLab, walking tasks with cognitive and manual tasks.
Straight walking, 180° and 360° turns, narrow spaces and hospital circuit with elevator, unexpected stops, starts, and turns.
IMU (2) wristsDecision tree classifier (C4.5) similar to [52], but fewer features and evaluation of single wrist input. 3 s windows, detection performance based on classifying FOG episode occurrences.E, SPerson-specific:
Sensitivity: 85%
Specificity: 80%
Person-independent:
Sensitivity: 90%
Specificity: 66%
No
Lorenzi 2016 [59,60,61,62]16 UFOG-PDLab, walking through doorway, 180° turns.IMU (2) shanks, IMU (1) side of headCompared headset (combined with NN) and shin mounted IMUs. Shin method using custom k-index feature compared to person specific thresholds performed best.EFrom shin system:
Sensitivity: 94.5%
Specificity: 96.7%
No
Rezvanian 2016 [63] *10 FOG-PD (8 froze), 237 episodesLab, straight walking, 180° turns, random instructions and simulated ADL.Acc (3) left shank, left thigh, lower backContinuous wavelet transform computed ratio of frequency ranges, compared to person-independent threshold. Compared different window lengths, suggested 2 s windows for future real-time implementation.EWindow 2 s:
Sensitivity: 82.1%
Specificity: 77.1%
Window 4 s:
Sensitivity: 84.9%
Specificity: 81.01%
No
Ahlrichs 2016 [64] ***20 FOG-PD (8 froze) ON and OFF, 209 episodesParticipant’s home, 180° turns, doorways, walking outside, dual tasking and a false positive test intended to create shaking resembling FOG (e.g., brushing teeth).Acc (1) waistPerson-independent SVM (linear kernel), best results with 3.2 s windows. Classified windows aggregated over 60 s and degree of confidence calculated and compared to threshold to determine whether a FOG episode was present during aggregation period.ESensitivity: 92.3%
Specificity: 100%
No
Capecci 2016 [65]20 FOG-PD (16 froze), ON, 98 episodesLab, TUG test, cognitive or manual dual task.IMU (1) smartphone at waistCadence and modified freeze index extracted and compared to person-specific thresholds. Detection performance based on classification of individual 3.56 s windows.ESensitivity: 87.57%
Specificity: 94.97%
No
Ly 2016 [66]7 FOG-PD, OFFLab, TUG.EEG, headPerson-independent NN, compared different features and number of EEG channel inputs. Data divided into 1 s segments (343 effective walking and 343 freezing).E, SUsing all 32 channels:
Sensitivity: 72.2%
Accuracy: 71.46%
No
Pham 2017 [20] *10 FOG-PD (8 froze), 237 episodesLab, straight walking, 180° turns, random instructions and simulated ADL.Acc (3) left shank, left thigh, lower backAnomaly detection approach. Acceleration and spectral coherence features calculated for incoming window and “normal” reference. Person-independent thresholds used to classify FOG, “normal” reference updated with each non-FOG window. Detection performance based on classification of individual 0.6 s windows.ESensitivity: 87%
Specificity: 94%
No
Pham 2017 [67] *Development: 10 FOG-PD (8 froze),
Test: 24 FOG-PD (OFF)
Lab, straight walking, 180° turns, random instructions and simulated ADL.
Test: TUG, 180° and 540° turns in both directions.
Acc (3) left shank, left thigh, lower back
IMU (7) foot, shank, thigh, lower back/hip
Development data from Daphnet*, test data from [68]. Several new features (including multichannel freeze index) presented and evaluated, detection used anomaly score compared to person-independent threshold to classify individual 3 s windows.E, SFreeze index using hip sensor X-axis:
Sensitivity: 89%
Specificity: 94%
No
Pham 2017 [69] *10 FOG-PD (8 froze), 237 episodesLab, straight walking, 180° turns, random instructions and simulated ADL.Acc (3) left shank, left thigh, lower backFreezing index and spectral coherence features used to generate average value used as threshold for FOG detection. Participant independent averages automatically updated during use. Detection performance based on classification of 0.6 s windows.ESensitivity: 89.2%
Specificity: 95.6%
No
Ahn 2017 [70]10 HC, 10 FOG-PD, OFF, 42 episodesLab, TUG and 10 m walk tests.IMU (1) in smart glassesCustom FOG detection on glasses feature (FOGDOG), incorporated stride length and cadence, with person-specific thresholds, 1 s windows. Detection performance based on classifying FOG episode occurrencesEFor PD participants:
Sensitivity: 97%
Specificity: 88%
Yes
Tahafchi 2017 [71]2 FOG-PDLab, 6 min of walking turning and stepping in place.EMG + IMU units (6) thighs, shanks, feetSVM with Gaussian kernel, multiple time series and frequency features. 1 s windows.ESensitivity: 90%
Specificity: 92%
No
Suppa 2017 [72]28 FOG-PD (25 froze), 152 episodes (102 OFF, 50 ON)Lab, simulated home environment, TUG passing into narrow hall, turning both directions.IMU (2) shinsk index from shin-mounted sensor compared to person-specific thresholds [59], with additional analysis of ON vs. OFF states.ESensitivity: 93.41%
Specificity: 98.51%
No
Kita 2017 [73]32 UFOG-PD (25 froze)Lab, straight walking, through doorway, with 180° turn, and return.IMU (2) shanksImprovements on k index in [59], including new Kswing, K’ features. Person-specific performance based on percentage of time frozen per trial.ESensitivity: 93.41%
Specificity: 97.57%
No
Rodriguez-Martin 2017 [74] ***, ****21 FOG-PD, ON and OFF, 1321 episodesParticipant’s home, 180° turns, doorways, walking outside, dual tasking and a false positive test intended to create shaking resembling FOG (e.g., brushing teeth).IMU (1) left hipSVM (radial basis function kernel), compared person-independent and person-specific models, using 3.2 s windows. Detection performance based on classifying FOG episode occurrences.EPerson-independent:
Sensitivity: 74.7%
Specificity: 79.0%
Person-specific:
Sensitivity: 88.09%
Specificity: 80.09%
No
Rodriguez-Martin 2017 [75] ***, ****12 PD-FOG, 106 episodesParticipant’s home, 180° turns, doorways, walking outside, dual tasking and a false positive test intended to create shaking resembling FOG (e.g., brushing teeth).IMU (1) left hipSame detection algorithm as [74], also using 3.2 s windows. Detection performance based on classifying FOG episode occurrences.ESensitivity: 82.08%
Specificity: 97.19%
Yes
Ly 2017 [76]6 FOG-PDLab, TUG.EEG, headPerson-independent Bayesian NN, to detect FOG during turns. Similar to [55], with addition of S-transform. Data divided into 1 s samples (204 normal turning, 204 FOG turning).E, SSensitivity: 84.2%
Specificity: 88.0%
No
Pepa 2017 [77]20 UFOG-PDLab, TUG, with cognitive or manual dual task, sit, lay on bed, stand up and maintain upright posture, and run on a treadmill if able.IMU (1) smartphone at waistFuzzy inference system compared to person-specific thresholds to detect periods of walking and FOG. 2.56 s windows (256 samples at 100 Hz). Detection performance based on classifying FOG episode occurrences, duration of FOG also examined.EFOG detection performance using ANOVA.Yes
Wang 2017 [78]9 UFOG-PD, OFFLab, gait initialization, narrow aisle, turning and dual tasks. One participant performed ADL in their home.Acc (1) lower backFI and RMS of acceleration. Both compared to person-specific thresholds and combined with an OR statement. Detection performance calculated as percent time frozen per trial.ESensitivity: 90.8%
Specificity: 91.4%
No
Punin 2017 [79]1 HC, 1 NFOG-PD, 6 FOG-PD, OFF, 27 episodesLab, stair climb and descent, straight walking and 180° turns.IMU (1) right ankleDiscrete wavelet transform, compared to person-independent threshold. Detection performance based on classifying FOG episode occurrences.ESensitivity: 86.66%
Specificity: 60.61%
Yes
Saad 2017 [80]5 FOG-PD ON, 64 episodesLab, straight walking, 180° turn, manual dual task or narrowed walking path. Clinic circuit including unscripted stops, starts, turns and doorways.Acc (2) foot, shin, Goniometer (1) knee, Telemeters (IR proximity sensors) (2) upper and lower medial shankTime and frequency domain features extracted from 2 s windows. Best features for each sensor identified. Person-independent, NN with Gaussian activation function used for detection.
Defined average performance as mean of the fraction of FOG correctly identified and the fraction of non-FOG correctly identified.
E, SAverage of all participants:
Performance: 87%
No
Sama 2018 [81] ****15 FOG-PD, ON and OFFParticipant’s home, 180° turns, doorways, walking outside, dual tasking and a false positive test intended to create shaking resembling FOG (e.g. brushing teeth).IMU (1) left hipCompared multiple classifiers and feature sets, best results with SVM, using 1.6 s windows (64 samples at 40 Hz). Person-independent detection performance based on classifying FOG episode occurrencesESensitivity: 91.81%
Specificity: 87.45%
No
Prateek 2018 [82]16 UFOG-PD (8 froze), 58 episodesLab, walking backwards, 180° turns, stepping over a board, walk a figure-eight loop, walk between sets of chairs placed close together.IMU (2) heelsDetect instances of zero velocity or trembling, then, a point process filter computed probability of FOG based on foot position, orientation, and velocity. Detection performance based on classifying FOG episode occurrences, duration of FOG also examined.EPerson-specific model, detected 47/58 FOG episode occurrences.
Accuracy: 81.03%
No
Ashour 2018 [83] *4 participants from DaphnetLab, straight walking, 180° turns, random instructions and simulated ADL.Acc (3) left shank, left thigh, lower backSVM (linear kernel). Used infinite feature ranking [84] to reduce feature set. Person-specific detection performance based on classifying FOG episode occurrences.E, S1 patient top ranked (30 features) Accuracy: 94.4%No
Camps 2018 [85] ****21 FOG-PD, ON and OFFParticipant’s home, 180° turns, doorways, walking outside, dual tasking and a false positive test intended to create shaking resembling FOG (e.g., brushing teeth).IMU (1) left hip1D CNN, 2.56 s windows stacked to combine current and previous windows. Person-independent detection performance based on classification of windows. Replicated other FOG detection methods and compared performance of models and feature sets.-CNN:
Sensitivity: 91.9%
Specificity: 89.5%
No
Oung 2018 [86] *10 FOG-PD (8 froze), 237 episodesLab, straight walking, 180° turns, random instructions and simulated ADL.Acc (3) left shank, left thigh, lower backProbabilistic NN, using time domain features (117) and frequency features (126), 4 s windows. Also examined SVM with RBF kernel. Person-specific and person-independent models compared.E, SPerson-specific:
Sensitivity: 99.83%
Specificity: 99.96%
Person-independent:
Sensitivity: 87.71%
Specificity: 87.38%
No
Li 2018 [87]10 FOG PD, OFF, 281 episodesLab, straight walking (10 m and 100 m), 180° turns, narrow spaces.Acc (1) lower backPerson-independent, unsupervised approach (training data not labelled). Mini batch k means clustering algorithm using acceleration entropy, 1 s windows. Once the centre of the FOG and non-FOG classes were found, new data were classified based on which centre was closest.ESensitivity: 92.4%
Specificity: 94.9%
No
Mikos 2018 [88,89]25 people, no other description provided (23 froze), 221 episodesLab, TUG and random walking.IMU (2) anklesSemi-supervised approach. NN, base training person-independent. Then unsupervised training during use improved performance.ESensitivity: 95.9%
Specificity: 93.1%
Yes
Rad 2018 [90] *10 FOG-PD (8 froze), 237 episodesLab, straight walking, 180° turns, random instructions and simulated ADL.Acc (3) left shank, left thigh, lower backProbabilistic anomaly detection approach using denoising autoencoder. Person-independent model trained to recognize normal gait (trained using non-FOG data), 1 s windows. Compared CNN trained using non-FOG (unsupervised) and FOG (supervised) data for comparison.-Proposed model:
AUC: 77%
Supervised model:
AUC: 84%
No
El-Attar 2019 [91] *10 FOG-PD (8 froze), 237 episodesLab, straight walking, 180° turns, random instructions and simulated ADL.Acc (1) left shankCombined 1D discrete wavelet transform with FFT features, and used NN for classification. Person-specific detection performance based on classifying FOG episode occurrences.EAccuracy: 96.3%No
Punin 2019 [92,93]1 HC, 1 NFOG-PD, 6 FOG-PD, 27 episodesLab, straight walking, 180° turns, stair climbing.IMU (2) back of ankles (distal posterior shank)Discrete wavelet transform, signal energy compared to person-independent threshold using 32 s windows (256 samples at 8 Hz), updated every second. Detection performance based on classifying FOG episode occurrences.ESensitivity: 60.61%
Specificity: 86.66%
Yes
Mazzetta 2019 [94]7 PD with varying disease severity, tested ON and OFFSimulated apartment, TUG turning both ways, narrow hallways and doorways.IMU/EMG devices shanks (tibialis anterior, gastrocnemius medialis)Multi-stage thresholds using gyroscope and surface EMG. Gyro signal and threshold used to identify beginning and end of each step, then custom R feature compared to person-independent threshold distinguished FOG. Detection performance based on classifying individual steps.EFalse positive rate 5%
False negative rate 2%
No
FOG Prediction
Mazilu 2013 [45] *10 FOG-PD (8 froze), 237 episodesLab, straight walking, 180° turns, random instructions and simulated ADL.Acc (3) left shank, left thigh, lower backAssumed duration of pre-FOG class (1–6 s). 3 class decision tree classifier (pre-FOG, FOG, not FOG) and 1 s window for feature extraction. Person-specific, prediction performance based on classification of individual windows.E, S1 participant with assumed 3 s pre-FOG
F1-score: 0.56
No
Mazilu 2015 [95] **11 FOG-PDLab, walking with cognitive and manual tasks: straight, 180° and 360° turns, narrow spaces and hospital circuit involving elevator, unexpected stops start and turns.Electrocardio-gram (1) (ECG) chest, galvanic skin response (1) (fingertip)Assumed Pre-FOG duration (3 s) used for feature selection. Feature extraction used 3 s window. Multivariate Gaussian distribution used in anomaly detection model. Person-specific model for each individual. Instead of pre-defined pre-FOG length, model decision threshold set manually. Prediction based on number of FOG episode occurrences.E, SSC data predicted 132/184 (71.3%) of FOG episode occurrences on average 4.2 s in advance, 71 false positives.No
Handojoseno 2015 [96]16 FOG-PD, 404 episodesLab, TUG.EEG, headPerson-independent NN trained with 462, 1 s data segments for each class, tested on 172 segments. Extracted multiple frequency-based features using FFT and wavelets, multilayer perceptron NN for classification. Defined pre-FOG as data between 5 s and 1 s prior to FOG.E, SSensitivity: 86%
Precision: 74.4%
No
Zia 2016 [97] *3 chosen randomly from DaphnetLab, straight walking, 180° turns, random instructions and simulated ADL.Acc (1) left shankPerson-specific layered recurrent NN. Detection applied to the 5 s prior to FOG. One participant had best results, trained on 9 episode occurrences, tested on 15.-Best participant:
Sensitivity: 30%
Precision: 89%
No
Palmerini 2017 [98] **18 FOG-PD (11 froze), 180 episodesLab, walking with cognitive and manual tasks: straight, 180° and 360° turns, narrow spaces and hospital circuit involving elevator, unexpected stops start and turns.IMU (3) ankles, lower backAssumed pre-FOG as 2 s before FOG. Features extracted from 2 s windows. Linear discriminant analysis to classify pre-FOG vs normal gait windows. Person-independent model.E, SSensitivity: 83%
Specificity: 67%
No
Handojoseno 2018 [99]16 FOG-PDLab, TUG.EEG, headPerson-independent NN trained with 462, 1 s data segments for each class, tested on 172. Predict FOG by classifying data segment 5 s prior to freeze with Bayesian NN.E, SSensitivity: 85.86%
Specificity: 80.25%
No
Torvi 2019 [100] *10 FOG-PD (8 froze), 237 episodesLab, straight walking, 180° turns, random instructions and simulated ADL.Acc (3) left shank, left thigh, lower backLSTM and RNN with 2 transfer learning approaches. Found best performance with LSTM, trained network then added person-specific final layer. Examined set pre-FOG duration: 1, 3 and 5 s.-Predicted FOG up to 5 s in advance with >90% accuracyNo
* Daphnet dataset originally collected by Bachlin et al. [24] (n = 10, 8 froze during testing). A total of 237 FOG episodes (8 participants OFF, 2 ON who claimed to freeze often while ON). Accelerometers on left shank, left thigh, and lower back. ** CuPiD dataset originally collected by Mazilu et al. [101] (n = 18, 11 froze during testing). 180 FOG episodes (ON/OFF state not mentioned in original article, subsequently reported ON state [58]). [52] reported 182 FOG episodes and [58] reported 184 episodes. IMU (9) on wrists, thighs, ankles, feet, and lower back. Galvanic skin response sensor (1) on hand, ECG sensor (1) on chest, smartphone (1) in front pocket with integrated IMU, pressure sensing shoe insole (1), functional near-infra-red spectroscopy (fNIR) sensor on forehead. *** REMPARK project (Personal Health Device for the Remote and Autonomous Management of Parkinson’s Disease) [102,103]. Data collected by multiple researchers, in participant’s homes in OFF and ON states. Waist worn IMU. **** MASPARK project [104]. Abbreviations and acronyms: Feature extraction (E), selection (S). FOG: freezing of gait; HC: healthy control participants; FOG-PD: people with PD with FOG symptoms; NFOG-PD: people with PD with no FOG symptoms; UFOG-PD: FOG symptoms not reported; ON: on medication, OFF: off medication; Acc: accelerometer, EEG: electroencephalogram; EMG: electromyography; Gyro: gyroscope; IMU: inertial measurement unit, CBMC: camera-based motion capture; CNN: convolutional neural network; NN: neural network; RNN: recurrent neural network; LSTM: long short-term memory neural network; SVM: support vector machine; ADL: activities of daily living; TUG: Timed Up and Go Test; AUC: area under ROC curve; FFT: fast Fourier transform; FI: freeze index [29]; FOGC: freezing of gait criterion; FSR: force sensing resistor; GSR/SC: galvanic skin response/skin conductance; PCA: principal component analysis; PCC: Pearson correlation coefficient; PSD: power spectral density; RMS: Root mean square; ROC: receiver operating characteristic.
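As an illustration of the window-based freeze-index thresholding approach that recurs throughout Table 1 and Table 2 (FI [29]), the following minimal Python sketch computes FI per window and compares it with a threshold. The sampling rate, window length and shift, threshold value, and synthetic signal are illustrative assumptions only, and Welch’s method is used here as one possible power-spectral-density estimator rather than the estimator of any particular study.

```python
import numpy as np
from scipy.signal import welch

def freeze_index(acc_window, fs, locomotion_band=(0.0, 3.0), freeze_band=(3.0, 8.0)):
    """Freeze index (FI) of one window of a single acceleration axis:
    power in the 3-8 Hz freeze band divided by power in the 0-3 Hz
    locomotion band, following the definition attributed to [29]."""
    freqs, psd = welch(acc_window, fs=fs, nperseg=len(acc_window), detrend="constant")

    def band_power(lo, hi):
        mask = (freqs >= lo) & (freqs < hi)
        # On a uniform frequency grid the bin width cancels in the FI ratio,
        # so a plain sum stands in for the area under the PSD curve.
        return psd[mask].sum()

    return band_power(*freeze_band) / band_power(*locomotion_band)

# Hypothetical usage: slide 4 s windows with a 0.5 s shift over a shank
# acceleration trace and flag windows whose FI exceeds a threshold.
# Sampling rate, threshold, and the random signal are placeholders only.
fs = 64                                   # Hz (assumed)
acc = np.random.randn(60 * fs)            # stand-in for 60 s of acceleration data
threshold = 2.0                           # illustrative value, not validated
win, step = 4 * fs, fs // 2
fog_flags = [freeze_index(acc[i:i + win], fs) > threshold
             for i in range(0, len(acc) - win + 1, step)]
```

In practice, person-specific thresholds and an additional energy criterion to suppress false positives during standing (as in [24]) would replace the placeholder values above.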
Table 2. Features extracted from wearable-sensor data and used for freezing of gait detection or prediction.
Feature Name | Sensor Type | Sensor Location | Feature Description | Source
MeanAcc,
Gyro
GSR Goniometer
Telemeters
Chest, wrist, lower back, waist, thigh, knee, shanks, ankle, foot,
GSR: finger,
Goniometers: knees,
Telemeters: between shanks
Mean of signal within window and axis.
Acceleration: 3D vector magnitude or 3 axes
Gyro: Angular velocity 3D vector magnitude, or 3 axes
GSR: Conductance, low-pass filtered at 0.9 Hz
Goniometer: Knee angular rotation.
Telemeter: Voltage output, spikes in signal indicate that legs are next to one another.
[42,45,52,54,57,58,71,74,75,77,80,81,95]
Min, Max, Median, HarmMean, GeoMean, Trim mean, Mode, RangeAcc, GSRShank, thigh, lower backGSR: fingerDescriptive statistics within given window.
Acceleration: 3D vector magnitude, or individual axes
GSR: Conductance, low-pass filtered at 0.9 Hz
[45,64,75,95]
Increment of mean valuesAccWaistDifference between mean of current window and mean of previous window for anterior/posterior acceleration.[57,74,81]
Difference in means of different axesAccWaistDifference in acceleration mean values between axes for current window (X and Y, X and Z, Y and Z).[57,81]
Number of peaks in a windowAccInstrumented pants, Acc (5) waist, thighs and shanksNumber of times relative acceleration signal [105] passes above a threshold during 1.5 s window. Normal reference set to 3. More than 3 peaks per 1.5 s considered possible FOG.[40,41]
Duration of acceleration above thresholdAccInstrumented pants, Acc (5) waist, thighs and shanksTime the relative acceleration signal [105] is above a threshold. Normal reference 0.85 s per 1.5 s window. Longer durations considered suspicious (possibly FOG).[40,41]
Turning degreesGyroLower backAngular rotation about vertical axis. Calculated as the integral of low pass filtered (1.5 Hz) angular velocity about the vertical axis.[98]
Left-right cross-correlationGyroAnklesMaximum cross-correlation between mediolateral angular velocity (de-trended), left and right ankles (0.25 to 1.25 s).[98]
Left-Right average SDGyroAnklesAverage between SD of mediolateral angular velocity (de-trended), of right and left ankles.[98]
RMSAcc,
Gyro
Sole of shoe, shank, thigh, low back, ankle, chestRoot mean square (RMS) of acceleration or angular velocity data in given window, for 3 axes.[45,48,54,78,86]
Inter quantile rangeAcc,
Gyro
Ankle, thigh, chest, and waistInterquartile range of acceleration or angular velocity in given window, for 3 axes.[54]
Standard deviationAcc,
Gyro
GSR
Goniometer (G)
Telemeters (T)
Chest, lower back, waist, thigh, shanks, ankle, foot, wrist, GSR: finger
G: kneesT: between shanks
Standard deviation in given window.
Acceleration: 3D vector magnitude or 3 axes
Gyro: 3D vector magnitude of angular velocity, or 3 axes
GSR: Conductance, low-pass filtered at 0.9 Hz
Goniometer: Knee angular rotation.
Telemeter: Voltage output, spikes in signal indicate that the legs are next to one another.
[42,45,52,54,57,58,71,74,75,77,80,81,88,89,95,98]
VarianceAcc,
Gyro
Shanks, thigh, lower back, waist, ankle, chestVariance in given window. Calculated for acceleration or angular velocity data in given window, for 3 axes. In [83] and [91], variance calculated for FFT signal and detail and approximation coefficients from discrete wavelet transform.[42,45,54,83,91]
Acceleration indicator ( S A C )AccShank, thigh, lower backBinary value, to detect acceleration in each axis
$S_{AC} = \mathrm{sgn}\left(\left(X - (\bar{X} - \sigma)\right)^{+}\right)$, where $X$ is a set of acceleration data, $\bar{X}$ is the mean of $X$, $\sigma$ is the standard deviation of $X$, $\mathrm{sgn}(a)$ is the sign function of $a$, and $(a)^{+}$ returns $a$ if $a \geq 0$ and 0 otherwise.
[20]
Zero velocity and Trembling event intervals (ZVEI, TREI)Acc, GyroHeelDirection of gravitational acceleration used to calculate ZVEI and TREI to determine if foot is stationary (zero velocity) or trembling, from all acceleration and angular velocity axes.[82]
Foot speedAcc, GyroHeelFoot position, orientation, and velocity, from 3 axis acceleration and angular velocity [106].[82]
IntegralAccWaist, shank, thigh, low backIntegral of acceleration in given window, for given axis.[57,74,81,86]
KurtosisAcc,
Gyro
Waist, ankle, shank, thigh low backKurtosis within a given window, from all acceleration axes, angular velocity, acceleration 3D vector, or absolute value of harmonics in 0.04–0.68, 0.68–3 and 3–8 Hz frequency bands (calculated from FFT of 3D acceleration)[45,54,57,74,75,81]
SkewnessAccWaist, shank, thigh, low backMeasure of signal asymmetry within a given window, from all axes of the acceleration, angular velocity, acceleration 3D vector magnitude, or absolute value of harmonics in 0.04–0.68, 0.68–3 and 3–8 Hz frequency bands (calculated from FFT of 3D acceleration).[45,57,74,75,81]
Mean absolute ValueAccShank, thigh, low back
$MAV = \frac{1}{N}\sum_{n=1}^{N} |x_n|$
For acceleration x within a window of N data points. Calculated for 3 axes.
[86]
Simple square intervalAccShank, thigh, low back
$SSI = \sum_{n=1}^{N} |x_n|^2$
For acceleration x within a window of N data points. Calculated for 3 axes.
[86]
v-order 2 and 3AccShank, thigh, low back
$v_2 = \left(\frac{1}{N}\sum_{i=1}^{N} x_i^2\right)^{1/2}, \quad v_3 = \left(\frac{1}{N}\sum_{i=1}^{N} |x_i|^3\right)^{1/3}$
For acceleration x within window of N data points. Calculated for 3 axes.
[86]
Waveform lengthAccShank, thigh, low back
$WL = \sum_{n=1}^{N-1} |x_{n+1} - x_n|$
For acceleration x within window of N data points. Calculated for 3 axes.
[86]
Average amplitude changeAccShank, thigh, low back
$AAC = \frac{1}{N}\sum_{n=1}^{N-1} |x_{n+1} - x_n|$
For acceleration x within a window of N data points. Calculated for 3 axes.
[86]
Difference absolute standard deviationAccShank, thigh, low back
$DASDV = \sqrt{\frac{1}{N-1}\sum_{n=1}^{N-1} (x_{n+1} - x_n)^2}$
For acceleration x within window of N data points. Calculated for 3 axes.
[86]
Maximum fractal lengthAccShank, thigh, low back
$MFL = \log_{10}\left(\sqrt{\sum_{n=1}^{N-1} (x_n - x_{n+1})^2}\right)$
For acceleration x within window of N data points. Calculated for 3 axes.
[86]
Step lengthAcc, CBMCWaist, thigh, shank, footDistance (m) between consecutive footfalls of the same limb, measured as double integral of A/P acceleration or by camera-based motion capture.[35,71,77]
Step durationGyroThigh, shank, ankle, footDuration (s) between consecutive footfalls of same limb, calculated from angular velocity peaks (raw or filtered)[35,50,71]
CadenceAcc, GyroFeet, shank, thigh, waistNumber of steps in given time (e.g., steps/minute), from time between peaks in angular velocity, vertical acceleration, second harmonic of acceleration in frequency domain [65], or calculated as in [107].[35,49,65,77]
Cadence variationAccWaistStandard deviation of cadence, from last 3 windows.[49]
Stride peaksGyro, Angular velocityShank (ankle)Peak of low pass filtered (4th order Butterworth 10 Hz) angular velocity within gait cycle, in frontal plane.[88,89]
Zero Crossing rate, mean crossing rateAccShank, thigh, low backNumber of times acceleration signal changes between positive and negative. Number of times acceleration signal changes between below average and above average in a given window. Calculated for 3 axes.[45]
Signal vector magnitudeAccShank, thigh, low backSummation of Euclidean norm over 3 axes over entire window, normalized by window length.[45]
PCAAcc
Goniometer (G)
Telemeters (T)
Waist, shank, thigh, low back
G: knees
T: between shanks
Principal component analysis, calculated from raw 3 axis acceleration data from all sensors, each acceleration axis within specific spectral bands, or used to decrease dimensionality of multi-sensor feature set.[45,74,80,81]
Normalized signal magnitude area (SMA)AccShank, thigh, low backAcceleration magnitude summed over 3 axes normalized by window length.[45,75]
Eigenvalues of dominant directions (EVA)AccShank, thigh, low backEigenvalues of covariance matrix of acceleration along all 3 axes.[45]
Energy (time domain)Acc, Gyro, EMG on tibialis anteriorForearm, foot, shank and thigh, waist, EMG: on shinEnergy, where x(n) is discrete signal in time domain, n sample index, T window length, and E signal energy:
$E = \sum_{n=1}^{T} |x(n)|^2$
Calculated from each acceleration or angular velocity axis, or from surface EMG signal.
[36,38]
Average acceleration energy (AAE)AccShank, thigh, low backMean of acceleration signal energy over 3 axes.[45]
Asymmetry coefficientAccShank, thigh, low backThe first moment of acceleration data in window divided by standard deviation over window. Calculated for 3 axes.[45]
Freezing of gait criterion (FOGC)Gyro, AccShankCadence and stride length measure, for stride n
$FOGC_n = \frac{C_n \, L_{min}}{C_{max}\,(L_n + L_{min})}$
where $C_n$ is cadence and $L_n$ stride length. Maximum cadence $C_{max}$ set to 5 strides/s, and minimum stride length $L_{min}$ = 5 cm. Cadence and stride parameters calculated from angular velocity and acceleration [108]
[46,47]
FOG detection on glasses
(FOGDOG)
AccHead
$FOGDOG = \frac{N_{step}}{N_{max}} \cdot \frac{D_{ref} - D}{D_{ref}}$
where $D$ is cumulative forward distance travelled by the person during the window, $D_{ref}$ pre-set normal forward distance travelled, $N_{step}$ cadence (number of steps/s), $N_{max}$ pre-set maximum normal cadence; forward distance from double integral of forward acceleration after correction for head tilt angle, step length from [109].
[70]
K index, and K’ indexGyroShankSummation of absolute value of low pass filtered angular velocity of left and right shanks in sagittal plane:
$k = \mathrm{lowpass}(|\omega_{left}|) + \mathrm{lowpass}(|\omega_{right}|)$
where $\omega_{left}$ and $\omega_{right}$ are the angular velocities of the left and right shanks in the sagittal plane.
[59,60,61,62,72,73]
R valueGyro, angular velocityIMU/EMG devices on shanks (tibialis anterior and gastrocnemius medialis)R value is calculated once for each stride.
$R = \frac{\max(ABS)}{\left. sEMG \right|_{t = t_{\max(ABS)}}}$
where $ABS$ is the absolute value of the moving-average angular velocity in the sagittal plane, $sEMG$ the surface EMG signal, $\max(ABS)$ the maximum of $ABS$ during a stride, and $\left. sEMG \right|_{t = t_{\max(ABS)}}$ the value of the surface EMG at that instant.
[94]
Ratio of height of first peakEMGEMG: shank (tibialis anterior)Height of peak at origin in autocorrelation of filtered EMG signal, in a given window.[38,110]
Lag of first peak (not at origin)EMGEMG: shank (tibialis anterior)Autocorrelation of filtered EMG signal, in a given window.[38,110]
Pearson’s correlation coefficient
(PCC)
Acc, Gyro, FSRShanks, thighs, waist, FSR: under feetSimilarity between two signals, with n sample points,
$x_i$, $y_i$: ith values of the x and y signals; $\bar{x}$, $\bar{y}$: their means.
$PCC = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2}\,\sqrt{\sum_{i=1}^{n}(y_i - \bar{y})^2}}$
Calculated between acceleration axes or between FSR force of a step compared to template “normal” step.
[37,50,57,74,75,81]
Ground reaction forceFSRUnder heel, ball of footSum of forces from all force sensing resistors (FSR) under a foot.[37]
Shank displacementAcc, GyroShanksShank displacement (m) calculated from vertical acceleration and pitch angular velocity [111].[50]
Change of the shank transversal orientationGyroShanksRotation angle in transversal plane, calculated as integral of angular velocity data about vertical axis, for each limb and each stride.[50]
Auto regression coefficientAccWaistFour auto-regression coefficients obtained by Bourg method from acceleration in all 3 axes [112].[57,74,81]
Entropy | Acc, Gyro, EEG | Acc: ankle, pants pocket, waist, wrists, chest, thigh; Gyro: chest, waist, low back; EEG: head | Shannon’s entropy:
$H(x) = -\sum_{i=1}^{n} P(x_i)\,\log_2 P(x_i)$
where the discrete variable x contains n values and P is the probability (often defined from a histogram). Calculated from each axis of acceleration or angular velocity in the time and frequency domains, or from filtered EEG voltage at multiple scalp locations.
[39,42,43,45,54,64,66,87,96]
Direct transfer function | EEG | Head | Application of coherence directionality in multivariate time series [113]. Signals from motor control regions: O1-T4 (visual), P4-T3 (sensorimotor affordance), Cz-FCz (motor execution), and Fz-FCz (motor planning). Data band-pass filtered (0.5–60 Hz), band-stop filtered (50 Hz), then normalized with a z-transformation. | [99]
Independent component analysis (ICA) | EEG | Head | Independent component analysis, used to maximize separation between signal components. Signals from motor control regions: O1-T4 (visual), P4-T3 (sensorimotor affordance), Cz-FCz (motor execution), and Fz-FCz (motor planning). Data band-pass filtered (0.5–60 Hz), band-stop filtered (50 Hz), then normalized with a z-transformation. | [99]
Raw FFT | Acc, Gyro, Goniometer (G) | Waist, shank; G: knee joint | The output signal from the FFT. Calculated using acceleration, the derivative of the knee angle, or angular velocity in the sagittal plane, in a given window. | [35,50,64]
PSD bands | Acc, EEG, Goniometer (G), Telemeters (T) | Heels, shank and thighs, knee, shanks; G: knee; T: between shanks | Specific frequency bands of the power spectral density (PSD), generated by FFT, short-time FFT (SFFT), Z-transformation, or another method to convert the time-domain signal into the frequency domain. Calculated from each acceleration and angular velocity axis, knee angular rotation, telemeter voltage, or filtered EEG voltages. | [30,55,66,80]
Ratio of peak frequencies | Goniometer | Knee angle | Computed from the FFT of the derivative of the knee angle: ratio of the highest amplitude in 3–8 Hz to the highest amplitude in 0.3–3 Hz. | [35]
Power in frequency domain | Acc | Ankle, shanks, thighs, waist, chest, wrists | Area under the power spectral density curve between specific bands, from the acceleration 3D vector magnitude or individual axes. Also, the ratio of specific bands (similar to FI) [86]. | [24,25,31,32,33,34,40,42,52,54,58,75,80,86]
Freeze index (FI) | IMU (Acc), Goniometer (G), Telemeter (T) | Acc: various locations and sensor orientations; G: knees; T: between shanks | Ratio of the signal power in the freeze band (3–8 Hz) to that in the locomotion band (0–3 Hz) [29]:
$FI = \frac{\text{Area under the PSD curve in the freeze band}}{\text{Area under the PSD curve in the locomotion band}}$
Calculated from acceleration and angular velocity axes, the 3D vector magnitude, knee angular rotation, or telemeter voltage (a minimal computation sketch follows this table).
[23,24,25,29,31,32,33,34,42,44,49,53,54,64,65,69,71,77,78,80,86,88,89,98]
Multi-channel FI (FI_MC) | Acc | Foot, shank, thigh, lower back/hip | Ratio of powers P_H to P_L (i.e., freeze and locomotor bands), which are summations of acceleration signal powers over N channels, where matrix X of size N × M represents an N-channel recording session with M regularly spaced time samples:
$FI_{MC} = \frac{P_H}{P_L}$
$P_H = \frac{1}{2 f_s} \sum_{n=1}^{N} \left[ \sum_{i=H_1+1}^{H_2} P_{XX}^{n}(i) + \sum_{i=H_1}^{H_2-1} P_{XX}^{n}(i) \right]$
$P_L = \frac{1}{2 f_s} \sum_{n=1}^{N} \left[ \sum_{i=L+1}^{H_1} P_{XX}^{n}(i) + \sum_{i=L}^{H_1-1} P_{XX}^{n}(i) \right]$
where N is the number of inputs, f_s the sampling frequency, P_XX^n the power spectrum of channel n, H_1 = 3 N_FFT / f_s, H_2 = 8 N_FFT / f_s, and L = 0.5 N_FFT / f_s.
[67]
K freeze index (FI_K) | Acc | Foot, shank, thigh, lower back/hip | Freeze index from each acceleration signal axis, with spectral analysis using the Koopman operator [114]. Koopman eigenvalues and eigenfunctions are treated as frequencies (λ) and power (K(λ)) [115]:
$FI_K = \frac{\sum_{\lambda=H_1+1}^{H_2} K(\lambda)}{\sum_{\lambda=L+1}^{H_1} K(\lambda)}$
where L = 0.5(2π), H_1 = 3(2π), H_2 = 8(2π).
[67]
Total power | Acc | Lower back, thigh, shank |
$TTP = \sum_{j=1}^{M} P_j$
where P is the power spectrum of the acceleration signal for a window of length M [116,117]. Calculated for the 3 axes.
[86]
Mean power | Acc | Lower back, thigh, shank |
$MNP = \frac{1}{M} \sum_{j=1}^{M} P_j$
where P is the power spectrum of the acceleration signal for a window of length M [116,117]. Calculated for the 3 axes.
[86]
Energy derivative ratio (EDR) | Acc | Lateral waist | Derivative of the vertical acceleration energy in the 3–8 Hz band divided by the derivative of the energy in the 0.5–3 Hz band. | [49,77]
Median frequency | Acc | Lower back, thigh, shank | Frequency MDF at which the cumulative spectral power reaches half of the total power:
$\sum_{j=1}^{MDF} P_j = \frac{1}{2} \sum_{j=1}^{M} P_j$
where P is the power spectrum of the acceleration signal for a window of length M [116,117]. Calculated for the 3 axes.
[86]
Peak frequency | Acc | Lower back, thigh, shank |
$PKF = \max(P_j), \quad j = 1, \dots, M$
where P is the power spectrum of the acceleration signal for a window of length M [116,117]. Calculated for the 3 axes.
[86]
Peak amplitude, frequency of peak amplitude | Acc | Waist, thighs, shanks | Maximum value in the frequency domain and the corresponding frequency bin. Calculated for the 0.5–3 Hz and 3–8 Hz bands. In [41] the relative acceleration signal is used, as defined in [105]. | [41,64]
Higher harmonics | Acc | Waist, shanks | The 3 frequency bins with the highest peaks. Calculated for all acceleration axes. | [57,64,74]
Frequency standard deviation | Acc, FSR | Waist, thighs, shanks; FSR: in-shoe insoles | Standard deviation of the signal in specific frequency bands, e.g., 0.1–0.68 Hz, 0.68–3 Hz, 3–8 Hz, 8–20 Hz, 0.1–8 Hz. Calculated for the 3 axes. | [57,64,74]
Spectral density centre of mass (COM) | Acc, EEG, Goniometer (G), Telemeters (T) | Acc: waist, thigh, shank, foot; EEG: head; G: knee; T: between shanks | With x(n) the amplitude of bin n and f(n) the frequency of bin n:
$COM = \frac{\sum_{n=0}^{N-1} f(n)\,x(n)}{\sum_{n=0}^{N-1} x(n)}$
Calculated from the 3-axis acceleration signal, filtered EEG voltage within specific frequency bands, knee angular rotation, or telemeter voltage.
[57,66,74,81,86,96]
1st, 2nd, and 3rd spectral moments | Acc | Lower back, thigh, shank |
$SM_1 = \sum_{j=1}^{M} f_j P_j, \quad SM_2 = \sum_{j=1}^{M} f_j^2 P_j, \quad SM_3 = \sum_{j=1}^{M} f_j^3 P_j$
where P is the power spectrum of the acceleration signal for a window of length M [116,117]. Calculated for the 3 axes.
[86]
Spectral coherence | Acc, EEG | Lower back, thigh, shank; EEG: head | Calculated from 3D acceleration or filtered EEG data using the Welch method [118]:
$C_{xy}(\omega) = \frac{P_{xy}(\omega)}{\sqrt{P_{xx}(\omega)\,P_{yy}(\omega)}}$
where ω is frequency, P_xx(ω) is the power spectrum of signal x, P_yy(ω) the power spectrum of signal y, and P_xy(ω) the cross-power spectrum of signals x and y. Also used with the wavelet power spectrum in [96]. EEG signal from 4 locations: O1-visual, P4-sensorimotor affordance, Cz-motor execution, and Fz-motor planning. Filtered band-pass (0.5–60 Hz).
[20,67,69,96]
Max amplitude and number of peaks of spectral coherence | Acc | Foot, shank, thigh, lower back/hip | Maximum amplitude and number of peaks of the spectral coherence feature [20]. | [67]
Discrete wavelet transform (DWT) | Acc, EMG | Lower back, thigh, shank; EMG: quadriceps | Discrete wavelet transform; decomposition coefficients (approximation and detail coefficients) used as features. Calculated from the acceleration 3D vector magnitude, each axis individually, or the raw EMG signal. | [51,56,71,79]
Select bands of the CWT | Acc | Lower back, thigh, shank | Continuous wavelet transform in specific ranges (0.5–3 Hz, 3–8 Hz); also the ratio of the signal in the 0.5–3 Hz band to the signal in the 0.3–8 Hz band. Calculated for the 3 axes. | [63]
Ratio of peak amplitude in wavelet transform bands | Goniometer | Knee (derivative of knee angle) | Sinusoidal wavelet transform used to calculate the ratio of the peak amplitude in the 3–8 Hz band to the peak in the 0.5–3 Hz band. | [35]
S-transform amplitude | EEG | Head | Maximum amplitude in the theta (4–8 Hz), alpha (8–13 Hz), low beta (lβ, 13–21 Hz), and high beta (hβ, 21–38 Hz) bands; the total amplitude across all bands was extracted for a specific time. Electrodes: F3, F4, FC1, FC2, C3, C4, CP1, CP2, CZ, P3, P4, PZ, O1, O2, OZ (F = frontal, C = central, P = parietal, O = occipital, Z = midline). Data band-pass filtered (0.5–40 Hz) and normalized with a z-transformation. | [76]
Energy (frequency domain) | Acc, Gyro | Foot, shank, thigh, forearm, waist, chest, ankle | Summation of the squared absolute values of the signal, where f(h) is the discrete frequency-domain signal with frequency bins h = 1 to H, and E the signal energy:
$E = \sum_{h=1}^{H} |f(h)|^2$
Calculated from the 3-axis acceleration or angular velocity signal.
[42,49,50,54,75,77,119]
Min and max amplitude of FFT and DWT | Acc | Shank, thigh, low back | Minimum and maximum values of the energy of the frequency-domain signal, for both the FFT and the DWT approximation and detail coefficients, as in [92,93]. Calculated from the 3-axis acceleration signal. | [83,91,92,93]
Cross-correlation | EEG | Head |
$R_{xy}(k) = E[x(n)\,y(n+k)]$
where x(n) and y(n+k) are two signals, k is the number of time units that signal y(n) lags x(n), and E[·] is the expectation operator. EEG signal from O1-visual, P4-sensorimotor affordance, Cz-motor execution, and Fz-motor planning. Filtered band-pass (0.5–60 Hz).
[96]
Cross power spectral density (CPSD) | EEG | Head | Cross power spectral density [120]:
$P_{xy}(f) = \sum_{k=-\infty}^{\infty} R_{xy}(k)\, e^{-j 2\pi f k T}$
where R_xy is the cross-correlation function. EEG signal from 4 locations: O1-visual, P4-sensorimotor affordance, Cz-motor execution, and Fz-motor planning. Filtered band-pass (0.5–60 Hz).
[96]
Weighted phase lag index (WPLI) | EEG | Head | Weighted phase lag index [121]. EEG signal from 4 locations: O1-visual, P4-sensorimotor affordance, Cz-motor execution, and Fz-motor planning. Filtered band-pass (0.5–60 Hz). | [96]
Wavelet cross spectrum | EEG | Head | The wavelet cross spectrum WCS_xy^i(s), defined as
$WCS_{xy}^{i}(s) = S\!\left(W_x^{i}(s)\, W_y^{i*T}(s)\right)$
where x and y are two time series, i the time-shift index, s the scale, S a smoothing operator, and W_x^i and W_y^i the wavelet transform coefficients. EEG signal from 4 locations: O1-visual, P4-sensorimotor affordance, Cz-motor execution, and Fz-motor planning. Filtered band-pass (0.5–60 Hz).
[96]
Phase locking value | EEG | Head | Phase locking value [122]:
$PLV_t = \frac{1}{N} \left| \sum_{n=1}^{N} e^{j\theta(t,n)} \right|$
where θ(t,n) is the phase difference between the signals, which can be derived from the angles of their wavelet coefficients. EEG signal from 4 locations: O1-visual, P4-sensorimotor affordance, Cz-motor execution, and Fz-motor planning. Filtered band-pass (0.5–60 Hz).
[96]
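Many of the entries in the feature table above reduce to simple windowed computations on a single sensor axis. The sketch below is illustrative rather than taken from any of the reviewed studies: it shows how three of the most frequently used features (time-domain energy, histogram-based Shannon entropy, and the freeze index) could be computed for one acceleration window with NumPy/SciPy. The window length, histogram bin count, and frequency-band edges are assumptions, since the reviewed studies vary on all of them.

```python
import numpy as np
from scipy.signal import welch

def time_domain_energy(x):
    """Time-domain energy of one windowed signal axis: E = sum_n |x(n)|^2."""
    return float(np.sum(np.abs(x) ** 2))

def shannon_entropy(x, bins=16):
    """Shannon entropy of one windowed axis, with P(x_i) estimated from a
    histogram of the samples (the bin count is an illustrative choice)."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def freeze_index(x, fs, loco_band=(0.5, 3.0), freeze_band=(3.0, 8.0)):
    """Freeze index: PSD area in the freeze band divided by the PSD area in
    the locomotion band. Band edges vary across studies (e.g., 0-3 Hz vs.
    0.5-3 Hz for the locomotion band)."""
    f, psd = welch(x, fs=fs, nperseg=min(len(x), 256))
    df = f[1] - f[0]
    def band_area(lo, hi):
        return float(np.sum(psd[(f >= lo) & (f < hi)]) * df)
    return band_area(*freeze_band) / max(band_area(*loco_band), 1e-12)

# Example on a single 4 s window of synthetic vertical shank acceleration
# sampled at 100 Hz (real pipelines slide overlapping windows over the trial).
fs = 100
t = np.arange(0, 4, 1 / fs)
acc = np.sin(2 * np.pi * 1.0 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)
features = [time_domain_energy(acc), shannon_entropy(acc), freeze_index(acc, fs)]
```

In practice such per-axis values are computed for every window and sensor axis, concatenated into a feature vector, and passed to the classifiers discussed in the text.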
Table 3. Top machine-learning methods from studies that compared different machine-learning classifiers for FOG detection using wearable sensors.
Machine-Learning Methods Tested | Best Method | Second Best | Third Best | Source
Random forests, decision trees, naive Bayes, k-nearest neighbor (KNN-1, KNN-2), multilayer perceptron NN, boosting (AdaBoost), and bagging with pruned decision trees. | AdaBoosted decision tree (1 s window) | Random forest (1 s window) | Bagging with decision tree (1 s window) | [42]
Naïve Bayes, random forest, decision trees, random tree. | Random forest (1 s window) | Decision tree (1 s window) | Random tree (1 s window) | [43]
k-nearest neighbor, random forest, logistic regression, naïve Bayes, multilayer perceptron NN, support vector machine. | Support vector machine (1.6 s window) | Random forest (1.6 s window) | Multilayer perceptron NN (1.6 s window) | [57,81]
Convolutional NN, decision trees with bagging and boosting (AdaBoost, LogitBoost, RUSBoost, RobustBoost), support vector machine. | Convolutional NN (2.56 s window) | Support vector machine (2.56 s window) | RUSBoost (2.56 s window) | [85]
