Review

A Systematic Review of Sensing Technologies for Wearable Sleep Staging

Wearable Technologies Lab, Imperial College London, London SW7 2AZ, UK
Sensors 2021, 21(5), 1562; https://doi.org/10.3390/s21051562
Submission received: 6 January 2021 / Revised: 13 February 2021 / Accepted: 20 February 2021 / Published: 24 February 2021

Abstract
Designing wearable systems for sleep detection and staging is extremely challenging due to the numerous constraints associated with sensing, usability, accuracy, and regulatory requirements. Several researchers have explored the use of signals from a subset of sensors that are used in polysomnography (PSG), whereas others have demonstrated the feasibility of using alternative sensing modalities. In this paper, a systematic review of the different sensing modalities that have been used for wearable sleep staging is presented. Based on a review of 90 papers, 13 different sensing modalities are identified. Each sensing modality is explored to identify signals that can be obtained from it, the sleep stages that can be reliably identified, the classification accuracy of systems and methods using the sensing modality, as well as the usability constraints of the sensor in a wearable system. It concludes that the two most common sensing modalities in use are those based on electroencephalography (EEG) and photoplethysmography (PPG). EEG-based systems are the most accurate, with EEG being the only sensing modality capable of identifying all the stages of sleep. PPG-based systems are much simpler to use and better suited for wearable monitoring but are unable to identify all the sleep stages.

1. Introduction

It is estimated that more than 3.5 million people in the United Kingdom and more than 70 million people in the United States suffer from some sort of sleep disorder [1,2]. Apart from having a major impact on the quality of life of individuals, sleep disorders also result in a huge financial impact on the economy stemming from expensive treatments, reduced productivity, road accidents, and many other areas that involve alertness and quick judgement [3,4]. Diagnosis of sleep disorders is an expensive procedure that requires performing a sleep study known as polysomnography (PSG) to monitor multiple parameters and physiological signals during sleep [5]. These may include neural activity (EEG), eye movements (EOG), muscle activity (EMG), heart rhythms (ECG), and breathing functions. Together, these signals, or a combination of them, are used to perform sleep scoring and to identify, or rule out, the presence of multiple sleep disorders.
Polysomnography is usually performed at sleep clinics in the presence of trained technicians, typically using four to six EEG electrodes; two EOG electrodes; four EMG electrodes; two ECG electrodes; and additional sensors such as pulse oximeters, sound probes, etc. The signals obtained from these sensors are analysed and scored into one of the different stages of sleep. Human sleep is broadly classified into Wake, REM (rapid eye movement), and NREM (non-rapid eye movement) stages [6]. The NREM stage is further divided into N1, N2, and N3 stages (or S1, S2, S3, and S4 based on the older Rechtschaffen and Kales (R&K) rules for classification) [7]. The process of assigning a sleep stage to each 30-second block of the PSG signal is known as sleep staging or sleep scoring.
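To make the epoch-based scoring convention concrete, the minimal sketch below splits a recording into non-overlapping 30-second blocks of the kind that are scored one stage each; the sampling rate and signal are placeholders for illustration and not taken from any specific system.

```python
import numpy as np

def segment_into_epochs(signal, fs, epoch_s=30):
    """Split a 1-D recording into non-overlapping epochs of epoch_s seconds.

    signal: one channel of a sleep recording (1-D array of samples)
    fs: sampling rate in Hz (assumed value, for illustration only)
    Trailing samples that do not fill a complete epoch are discarded.
    """
    samples_per_epoch = int(epoch_s * fs)
    n_epochs = len(signal) // samples_per_epoch
    return signal[:n_epochs * samples_per_epoch].reshape(n_epochs, samples_per_epoch)

# Example: 8 hours of a hypothetical 100 Hz channel -> 960 epochs of 3000 samples
epochs = segment_into_epochs(np.zeros(8 * 3600 * 100), fs=100)
print(epochs.shape)  # (960, 3000)
```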
Despite the importance of PSG for the diagnosis of sleep disorders, its high cost coupled with the necessity of clinical admission and long waiting lists [8] limits its usage outside of specialised clinics. Additionally, manual analysis and scoring of sleep from PSG traces is a tedious task that can take 2–4 h for data from an entire night's sleep [9]. The advent of low-power wearable sensors and edge computing has resulted in several systems being developed to make PSG easier and to make it possible to perform sleep monitoring with automatic sleep detection and staging over multiple nights at the home of patients [10]. Researchers have used several sensors to obtain physiological parameters during sleep and developed various methods for their processing to classify them into sleep stages. Two intrinsically linked areas of research have emerged, with one focusing on signal acquisition and the other on signal processing. In the case of signal acquisition, there is a push towards finding alternative sources of sleep information than those traditionally used for PSG. This involves developing new sensors, interfacing circuitry, implementing low-power hardware, as well as integrating everything into smaller packages. The signal processing research focuses heavily on using techniques to perform feature extraction and on taking advantage of the advances in machine learning for classification.
While several review papers have focused on the algorithmic accuracy of different methods for automatic sleep detection and staging using a variety of physiological signals [11], this paper presents a systematic review of the sensing technologies available to acquire those signals. It focuses on systems and methods that are useful for long-term sleep monitoring through the use of home-based wearable devices. It presents a comprehensive discussion of the relevant sensing technologies focused on their suitability, limitations, usability, as well as their ability to extract signals that can reliably be used for automatic sleep detection and staging. The rest of this paper is organised as follows. Section 2 describes the review methodology used to identify candidate papers for this review. Section 3 synthesises the results presenting an overview of the different sensing modalities that are used for wearable sleep detection and staging. For each sensing modality, the challenges associated with signal acquisition, reliability, usability, and accuracy are discussed as well as a review of the different papers that have used the sensing modality in the context of automatic sleep staging. Finally, Section 4 discusses the overall landscape of sensing technologies looking at the different trade-offs that need to be taken into account when designing a wearable device for sleep detection and staging.

2. Methodology

2.1. Data Sources

This systematic review is performed using the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement [12]. It includes peer-reviewed articles written in English published between 2010 and 2021. The lower limit of 2010 was chosen because of the more recent developments in the area of wearable computing and systems. The initial list of articles was compiled by searching the PubMed, SCOPUS, and IEEEXplore databases. The electronic search terms used with PubMed (as well as the other databases) included wearable sleep staging, wearable sleep scoring, wearable sleep, wearable sleep eeg, wearable sleep ecg, wearable sleep sensors, wearable sleep breathing, wearable sleep plethysmography, wearable sleep ppg, wearable sleep actigraphy, and wearable sleep movement. The search made use of the binary OR operator to look for the presence of these terms in the title, abstract, or keywords of the articles and limited the publication year to be at least 2010. Additional articles were obtained from the bibliographies of the articles found. The date of the last search was 8 February 2021. The initial search resulted in 813 articles that were further screened to remove duplicates and assessed against the eligibility criteria described in the next section. A flow diagram for identification, screening, and eligibility of the papers to be included in the review is shown in Figure 1.

2.2. Eligibility Criteria

The initial set of articles was filtered to remove duplicates and to narrow down the scope in order to include only articles that were focused on designing a system or algorithm for wearable sleep detection and staging. This may include either a sensor designed specifically for sleep staging, including distinction between sleep and wake, or a method developed using data from a sensing source that could be used in a wearable context. This resulted in 181 articles that were deemed relevant. This number was further reduced to 90 after the full text was assessed against the following eligibility criteria.
  • Does the article present a sensor/method for sleep staging?
  • Does the sensor/method in article classify at least one stage of sleep (including wake as it is considered a distinct stage based on both the R&K and AASM rules of classification)?
  • Is the sensor/method intended for staging human sleep?
  • Does the article discuss potential use of the sensor/method in wearable context? If not, is the sensor/method obviously suitable for wearable use (e.g., wrist-based devices)?
  • Is the sensor/method designed for monitoring overnight sleep (i.e., not only daytime naps)?
  • Is the sensing modality clearly defined?
  • Has the work being presented been validated against a reference method?
  • Does the study report at least one quantifiable measure of accuracy?

2.3. Data Extraction

Data from each study was extracted into a spreadsheet that had columns for all the variables sought for this review. Where relevant, accuracy measures were computed from the reported metrics in the paper and entered into this spreadsheet. From each study presenting a system or method for automatic sleep staging, data were sought on the sensing modality of the system used for data collection, the classification accuracy of the method, the sleep stages being classified, the validation reference, the location of data collection, the age range of subjects, population size, and the source of data (i.e., an actual device or an existing database).

2.4. Risk of Bias

There is a risk of bias in individual studies reporting results based on different validation references. Even when the validation reference is the same, manual scoring by sleep technicians is subjective and hence may have some influence on the reported results. Additionally, sample size, study location, stages classified, and the reporting metrics themselves can also impact the results. To minimise these risks, two well-known measures of overall accuracy have been used and only those studies that report at least one of these measures were included. Furthermore, the other variables that may have an impact are clearly reported in the results.
Overall, the studies that have been included focus on systems and methods designed for use in wearable sleep systems. Many studies do not mention the wearable or ambulatory nature of their devices. Hence, for each such study with insufficient context, an assessment was made to determine its eligibility against a subjective criterion of its possible use in a wearable context.

3. Results

Figure 2 shows the frequency of articles published each year. The trend is very clear, with the number of published articles increasing from 2014 onwards, with a small number of articles before that time. This is not surprising since ultra-low power processors and miniaturised electronics have become readily available in the last few years, leading to an exponential increase in research in the area of ambulatory and wearable devices. Additionally, there has also been an increased focus on sleep and its importance for overall health and wellness, leading to a rise in wearable devices being developed to monitor sleep.
Table 1 lists the eligible articles included in this review, summarising their sensing modalities, extracted signals, and sleep stages classification. The different sensing modalities together with their usability, accuracy, and ability to classify different sleep stages are discussed in the following sections.

3.1. Overview of Different Sensing Modalities for Sleep Staging

The gold standard for sleep staging is polysomnography (PSG), which involves monitoring of brain waves, eye movements, and muscle movements amongst other physiological parameters. However, due to the difficulties in using PSG outside of clinics, alternative sensing modalities have been proposed for extracting information related to the different stages of sleep. Table 1 lists 13 different sensing modalities that have been used by researchers to extract various signals for sleep detection and staging. Before discussing these sensing modalities, a brief overview of their usage for sleep detection and staging is presented.

3.1.1. Electroencephalogram

An electroencephalogram (EEG) is a recording of the electrical signals of the brain. It is performed by placing several electrodes on different locations of the scalp. The electrodes then feed the signals into a front-end electronic system consisting of amplifiers, filters, and other data acquisition circuitry. EEG signals are highly useful for assessing the health of the brain as well as for diagnosing different sleep and neurological disorders. As part of PSG, for sleep staging, at least three channels of EEG are used to acquire signals from different locations on the scalp. The relative levels of signals at different frequency bands (e.g., alpha (8–13 Hz), delta (0.5–4 Hz), etc.) are then used to identify the sleep stages following the guidelines of the American Academy of Sleep Medicine (AASM) [6]. Although EEG is the signal with the richest information related to sleep, it is difficult to use due to multiple issues such as electrode displacement and noise [133,134,135].
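To illustrate how the relative band levels mentioned above can be quantified, the sketch below computes relative spectral power in conventional EEG bands for one 30-second epoch using Welch's method; the band limits and window length are common conventions assumed here, not prescriptions from the AASM rules or the reviewed papers.

```python
import numpy as np
from scipy.signal import welch

# Conventional band limits (assumed for illustration)
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def relative_band_powers(epoch, fs):
    """Relative spectral power of a 30-s EEG epoch in each band (0.5-30 Hz)."""
    freqs, psd = welch(epoch, fs=fs, nperseg=int(4 * fs))   # 4-s Welch segments
    broadband = (freqs >= 0.5) & (freqs <= 30)
    total = np.trapz(psd[broadband], freqs[broadband])
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs <= hi)
        powers[name] = np.trapz(psd[mask], freqs[mask]) / total
    return powers
```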

3.1.2. Electrooculogram

An electrooculogram (EOG) is a signal that is generated as a result of eye movements and captured using electrodes placed near the eyes. It is highly useful for identifying the wake and REM stages of sleep since major eye movements occur during these stages. Generally speaking, eye movements tend to slow down with the depth of sleep.

3.1.3. Electromyogram

An electromyogram (EMG) is a recording of the electrical signals generated as a result of muscle movements. Electrodes placed on specific muscles result in changes in the signal level whenever those muscles move. As part of full PSG, leg and chin muscle movements are recorded through EMG. Leg movement activity is useful for the diagnosis of specific sleep disorders such as periodic limb movement disorder. Chin movement activity helps to differentiate wake and REM sleep stages from those with similar EEG characteristics.

3.1.4. Electrocardiogram

An electrocardiogram (ECG) is a recording of the electrical activity of the heart. It provides a snapshot of the regular functioning of the heart as well as information about the heart rate. ECG is not conventionally used for sleep staging and is not a required signal for scoring sleep based on AASM guidelines [6]. However, researchers have shown that different features extracted from ECG such as R-R intervals, heart rate, and heart rate variability (HRV) correlate with the changes in sleep macrostructure and thus can be useful for identifying the different stages of sleep [136].
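As an illustration of the cardiac features referred to above, the sketch below derives the mean heart rate and two standard time-domain HRV measures (SDNN and RMSSD) from a window of R-R intervals; the function and variable names are ours, not taken from any reviewed system.

```python
import numpy as np

def hrv_features(rr_intervals_ms):
    """Basic time-domain heart rate and HRV features from R-R intervals (ms)."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    mean_hr = 60000.0 / np.mean(rr)              # mean heart rate in beats/min
    sdnn = np.std(rr, ddof=1)                    # overall variability (ms)
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))   # beat-to-beat variability (ms)
    return {"mean_hr": mean_hr, "sdnn": sdnn, "rmssd": rmssd}

# Hypothetical intervals around 1000 ms (roughly 60 bpm)
print(hrv_features([980, 1010, 995, 1020, 1005]))
```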

3.1.5. Photoplethysmogram

A photoplethysmogram (PPG) is a signal representing the changes in blood volume in the microvascular bed of tissue [137]. It is obtained using a simple optical measurement technique in which an LED shines light on the tissue and a photodetector detects the reflected light. The PPG can then be used to measure the heart rate, respiratory rate, and oxygen saturation levels. In clinical practice, PPG-based pulse oximetry is popularly used for continuous measurement of oxygen saturation. However, it suffers from reliability issues mainly due to artefacts emanating from movement of the finger. More recently, wearable PPG sensors have been incorporated in wrist-worn consumer devices to extract the heart rate, making the technology widely accessible [95,138]. This is useful for sleep monitoring since researchers have shown that changes in heart rate and respiratory rate can be useful for identifying the different stages of sleep [139,140].
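The sketch below shows one simple way a pulse rate could be estimated from a PPG segment by peak detection; real wearable pipelines typically add band-pass filtering and motion-artefact rejection first, and the refractory period used here is an assumed parameter.

```python
import numpy as np
from scipy.signal import find_peaks

def pulse_rate_from_ppg(ppg, fs):
    """Rough pulse rate estimate (beats/min) from a PPG segment sampled at fs Hz."""
    # Assume at most ~150 bpm, i.e., at least 0.4 s between successive pulse peaks
    peaks, _ = find_peaks(ppg, distance=int(0.4 * fs))
    if len(peaks) < 2:
        return None                               # not enough peaks to estimate
    intervals_s = np.diff(peaks) / fs             # pulse-to-pulse intervals
    return 60.0 / np.mean(intervals_s)
```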

3.1.6. Accelerometer

Accelerometers are inexpensive and easy-to-use sensors that record periods of movement and inactivity during sleep and can be used to assess the sleep/wake cycles. They are used for long-term sleep monitoring through wrist-worn devices, also known as actigraphy [141].
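To show the general flavour of actigraphy-based sleep/wake scoring, the toy sketch below compares a weighted sum of activity counts in a window of surrounding epochs against a threshold; the weights and threshold are hypothetical placeholders and not the coefficients of any published algorithm.

```python
import numpy as np

def sleep_wake_from_counts(activity_counts, weights=None, threshold=1.0):
    """Score each epoch as sleep or wake from per-epoch activity counts."""
    if weights is None:
        weights = np.array([0.1, 0.2, 0.4, 0.2, 0.1])   # symmetric, centred window
    counts = np.asarray(activity_counts, dtype=float)
    score = np.convolve(counts, weights, mode="same")    # weighted local activity
    return np.where(score < threshold, "sleep", "wake")

# A burst of movement in the middle of an otherwise quiet period
print(sleep_wake_from_counts([0, 0, 5, 30, 2, 0, 0]))
```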

3.1.7. Respiratory Inductance Plethysmography

Respiratory inductance plethysmography (RIP) is a noninvasive tool to monitor breathing patterns. It uses a belt placed around the abdomen (or thorax) and measures abdominal movements that are correlated with respiratory movements during inspiration and expiration. Since the respiratory effort changes throughout sleep [142], the RIP signal can be useful for studying and identifying different stages of sleep.

3.1.8. Pressure Sensors

Some researchers have developed sensitive pressure sensors installed on a mattress to record body motion and ballistocardiography (BCG) during sleep. These sensors are used to extract information such as respiratory rate, heart rate, and movements in bed [143,144]. These can be useful for identifying different sleep stages since body movement is reduced in deep sleep and the respiratory rate also changes relative to light sleep or wake stages.

3.1.9. Radar Sensors

Radar sensors are used as a non-contact and wireless method of breathing and movement detection [145]. These signals can then be used for sleep staging.

3.1.10. Audio

Microphones placed in various configurations are used to record audio signals during sleep. These signals capture the changes in breathing sound intensity, snoring, and other sounds due to abrupt movements that can be used for sleep staging [146].

3.1.11. Nasal Airflow

Thermal airflow sensors (thermistors or transducers) are used to measure airflow through the nose. The nasal airflow signal shows changes in breathing, which can be helpful for identifying sleep stages [147].

3.1.12. Sonar Sensors

Similar to radar sensors, sonar sensors can also be used as a non-contact and wireless method of breathing and movement detection and subsequently for detecting different sleep stages.

3.1.13. Electrodermal Activity Sensors

Electrodermal Activity (EDA) sensors are used to measure skin conductance at the wrist, palm, or fingers. The EDA levels are known to be higher during deep stages of sleep compared to the lighter ones and hence can be used to differentiate N2 and N3 stages from others [148].

3.2. Discussion on Sensing Modalities

The different articles included in this review used one or more of the sensing modalities described in the previous section. Table 2 shows the number of articles that used each of the sensing modalities in their sleep staging method or system, either as the only source of signal information or in combination with other sensors.
Of the 90 articles reviewed, 61 used one of the 13 sensing modalities as the only source of information whereas the others used a combination of two or more of the different sensing modalities. It is not surprising that, among systems using only a single sensing source, EEG is the most popular choice of sensing modality for wearable sleep detection and staging. This is because it is part of the PSG gold standard and the AASM guidelines for sleep scoring are based on the interpretation of EEG signals. Hence, as a signal source, it is considered the one providing the most information that can be clearly used to classify the different stages of sleep while adhering to the medical guidelines. Following EEG, the second most used single-source sensing modality is the accelerometer. This is mainly a consequence of the prevalence of their use in actigraphy, which allows for capturing sleep/wake patterns. The third most used single-source sensing modality is the ECG. This is perhaps due to its ability to provide accurate heart rate and heart rate variability measurements that have been shown to correlate with the different sleep stages [136]. The other sensing modalities are much less commonly used on their own.
Accelerometers, on their own, have been in use for a very long time for actigraphy to detect sleep and wake states. While they do not provide more detailed sleep staging information, they are considered an acceptable and useful method to assess and treat disorders related to sleep patterns [149]. When used in combination with other sensors, accelerometers can be used reliably to refine the sleep staging outputs. For example, when used with EEG, they may be helpful in differentiating between the wake and REM stages, which have similarities in the EEG and usually require further input from EOG and EMG channels (however, this will not be possible during periods of quiet wakefulness with little to no body movement). Because of their ease of use for patients as well as their low cost of development, accelerometers are the most common sensing modality used in combination with other sensors. After accelerometers, the sensing modality most commonly used in combination with other sensors is PPG. This is because of the recent explosion in wrist-worn consumer wearables that incorporate PPG sensors providing heart rate information that can subsequently be used for sleep detection and staging. From Table 1, it can be seen that PPG is most commonly used with accelerometers. However, these two sensors together commonly provide four-state sleep staging rather than full five-state sleep staging. The EEG and ECG sensing modalities are also used at times with other sensors despite being quite information rich on their own. In most of the cases where EEG and ECG are used with other sensors, they are the predominant source of information, with the other sensors used only to refine and improve the classification accuracy.
It should be noted, though, that the use of accelerometers as a sensing modality is highly prevalent in actigraphy studies. Accelerometers have been in use for a long time, are inherently wearable and cost effective, may be used to target generalised sleep disorders, and are available in millions of consumer devices around the world. As a result, it is likely that more studies have been published using such systems, resulting in a higher number of papers based on them.

3.3. Signals and Features

When the sensing modality of EEG is used, the signals extracted are brain waves. The main difference between different systems that use EEG is the number of channels and the channel locations. For example, single-channel EEG-based systems may prefer frontal channels over central ones due to the ease of electrode placement. Nevertheless, regardless of which channels are used, the main objective is to interpret their spectral content and to score the signal based on the AASM guidelines. The other sensing modalities, however, acquire physiological signals and process them to score sleep stages despite the lack of any standardised guidelines for those signals. These include cardiac signals, respiratory signals, and body movements.
The different features of cardiac signals that have been used include the heart rate, heart rate variability, and R-R intervals. These may in turn be extracted from different sensing modalities. For example, the heart rate (and pulse rate) is commonly obtained using ECG or PPG. The respiratory feature most commonly extracted is breathing rate. A number of different sensing modalities are used to acquire respiratory signals. These include RIP to measure the effort, audio sensors to record breathing sounds, and non-contact radar sensors and pressure sensors to measure movements, which are subsequently used to extract the breathing rate. Additionally, tri-axial accelerometers have also been used to measure chest movement while breathing to obtain the breathing rate. When placed on wrists, accelerometers are commonly used to obtain general body movement signals that are used in actigraphy.
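As an example of how a respiratory feature can be derived from any of the movement-based modalities above, the sketch below estimates the breathing rate as the dominant spectral peak of a respiratory movement signal; the 0.1–0.5 Hz band (6–30 breaths/min) is an assumed plausible range, not a value taken from the reviewed studies.

```python
import numpy as np
from scipy.signal import welch

def breathing_rate(resp_signal, fs, lo=0.1, hi=0.5):
    """Breathing rate (breaths/min) from a respiratory movement signal (RIP belt,
    chest accelerometer axis, pressure sensor, etc.) sampled at fs Hz."""
    nperseg = min(len(resp_signal), int(60 * fs))        # up to 60-s spectral windows
    freqs, psd = welch(resp_signal, fs=fs, nperseg=nperseg)
    band = (freqs >= lo) & (freqs <= hi)
    f_breath = freqs[band][np.argmax(psd[band])]         # dominant breathing frequency
    return 60.0 * f_breath
```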
Of all the articles reviewed in this paper, 19 use signals and features extracted from EEG. Amongst the articles presenting non-EEG-based systems, more than 35 use at least one feature extracted from cardiac signals. This is followed by 24 articles using movement signals and 17 using respiratory features. Thus, it can be concluded that, when using non-EEG sensing modalities, cardiac signal features are the most popular choice for obtaining sleep staging information.

3.4. Sleep Stages

The AASM defines five different stages of sleep (wake, N1, N2, N3, and REM), whereas the previous R&K guidelines defined seven stages (wake, S1, S2, S3, S4, REM, and movement time). Of all the reviewed articles, only 21 performed classification of all the sleep stages defined by either of these guidelines [24,39,42,45,50,58,67,72,76,80,88,89,90,93,101,102,105,108,109,117,121]. Amongst them, all but three [24,58,150] used EEG signals for classification, where the difference between sleep stages is known to be most obvious. In [150], where only PPG signals are used with accelerometry data to detect movements, the overall accuracy is quite low. This highlights the fact that a clear distinction between different stages is challenging with limited information, which is a consequence of the constraints in a wearable device. It is well known that, even when using EEG signals, the identification of the N1 and REM sleep stages is rather challenging [151] and thus requires additional sensing modalities in the form of EOG and EMG. Hence, some researchers combine the N1 and N2 stages and refer to them as the light stage of sleep, with N3 (or S3 and S4 combined) referred to as deep sleep. Others opt to group sleep into three classes only: wake, NREM, and REM. These approaches are most commonly seen in systems where the sensing modality is based on PPG, ECG, and radar. Where only accelerometry is used, the only reliable information that can be inferred is the distinction between sleep (no movement) and wake (movement) stages.
Grouping sleep stages together to form a different class helps to improve classification accuracy. However, apart from accelerometry-based actigraphy systems, the usefulness of this approach for medical diagnosis is still to be determined. It is perhaps for this reason that popular wrist-worn PPG-based devices are not regulated for use as medical devices and only marketed as consumer devices.
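To make the stage groupings discussed above concrete, the short sketch below re-labels a five-stage AASM hypnogram into the four-class (wake/light/deep/REM) and three-class (wake/NREM/REM) schemes; the label strings are illustrative only.

```python
# Illustrative re-labelling of AASM stages into reduced class schemes
TO_FOUR_CLASS = {"W": "wake", "N1": "light", "N2": "light", "N3": "deep", "REM": "rem"}
TO_THREE_CLASS = {"W": "wake", "N1": "nrem", "N2": "nrem", "N3": "nrem", "REM": "rem"}

hypnogram = ["W", "N1", "N2", "N3", "N2", "REM"]          # one stage per 30-s epoch
print([TO_FOUR_CLASS[s] for s in hypnogram])
print([TO_THREE_CLASS[s] for s in hypnogram])
```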

3.5. Accuracy

The accuracy of classification of different sleep stages not only depends on the type of signal obtained from different sensing modalities but also on the algorithms developed to process them further. Direct comparison between the performance of different systems is difficult due to a number of factors [152], including the fact that the metrics used by the authors of different studies can be very different. Further, since there are multiple sleep stages, Cohen's kappa is a better measure to indicate the accuracy of classification. Consequently, both the reported classification accuracy and kappa values are included in Table 1. It should be noted that these values have been obtained from the papers and that, if they were unavailable, they were computed from the reported results (where sufficient information on classification was available). The studies being performed in different locations with different sample sizes and different stages being classified represent a potential source of bias in the reporting of results. Hence, Table 1 lists the various potential bias sources, and the accuracy needs to be looked at in context. While the effect of study location is not entirely clear, it is likely that those studies classifying a smaller number of sleep stages achieve higher accuracy.
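For reference, the sketch below computes the two agreement measures used in Table 1 (overall accuracy and Cohen's kappa) from a confusion matrix of predicted versus reference stages; the matrix values are made up for illustration.

```python
import numpy as np

def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows: reference stages, columns: predicted stages)."""
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    p_observed = np.trace(confusion) / n
    # Chance agreement expected from the row and column marginals
    p_expected = np.sum(confusion.sum(axis=0) * confusion.sum(axis=1)) / n ** 2
    return p_observed, (p_observed - p_expected) / (1.0 - p_expected)

# Hypothetical three-class (wake/NREM/REM) confusion matrix
conf = [[80, 15, 5],
        [10, 180, 10],
        [5, 20, 75]]
print(accuracy_and_kappa(conf))
```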
Despite the accuracy reporting challenges, it can be noted, with some exceptions, that the majority of PPG-based three- or four-class systems are able to achieve classification accuracies of 65–75%, with kappa values between 0.5 and 0.6. This is partly because some of these systems using PPG and accelerometry together tend to overestimate N3 sleep and to underestimate other stages [153]. Those using EEG as the signal source are able to achieve classification accuracies of 80–90%, with the highest kappa values of over 0.8 indicating strong agreement. If the classification stages are limited to sleep and wake classes, then accelerometers can also achieve 90% accuracy. Additionally, classification based on signals obtained from pressure and radar sensors for the wake, NREM, and REM stages is shown to achieve accuracy of up to 80%. In general, systems based on EEG have the highest accuracy and kappa values whereas others based on accelerometers and non-contact sensors have the lowest for five-state staging [154,155]. While EEG-based methods achieve higher classification accuracy compared to other sensing modalities, it should be noted that the studies included in this review with EEG sensing modality predominantly use a single channel or a limited number of channels for data acquisition. This is because of the usability challenges associated with multi-channel EEG systems that make them unsuitable for wearable sleep staging. However, while better than other sensing modalities, the accuracy of single-channel EEG systems is reduced when compared to multi-channel sleep staging systems. In particular, the loss in accuracy is more pronounced in the REM and N1 stages, resulting in a decrease in overall agreement [60].

3.6. Usability

Wearable devices are a relatively recent development, but several usability studies have already been published to understand the requirements for making them accessible to a wide range of the population [156,157]. Although the usability requirements of wearable sleep staging devices are similar to those of other wearables, where different trade-offs pertaining to battery life, power consumption, human factors, size, and weight need to be taken into account, there are certain additional constraints [158]. For example, since they are likely to be worn just before sleep, they need to be very easy and quick to put on. If this is not the case, user compliance will be very low. Additionally, compared to wearables used during the day, devices designed to monitor sleep need to be more comfortable to avoid disrupting sleep. These are the main reasons why polysomnography is difficult to use on a regular basis outside of clinics without the supervision of trained staff to connect the various electrodes.
The general trend with wearable sleep detection and staging systems is to use a reduced subset of PSG sensors. Most notably, these include ECG or single-channel EEG, which is considerably easier for patients to use and provides a wealth of information useful for scoring the signals. However, the reliability of signals due to electrode displacement throughout the night remains a challenge [133,134]. Additionally, the cost of EEG-based systems is higher due to the advanced electronic circuitry needed to acquire high-quality signals. As a result, using PPG and accelerometry with wrist-worn devices remains the most usable option in terms of comfort for patients. These sensors are already available in a myriad of consumer devices with integrated software to analyse sleep stages and thus do not interfere with the normal lifestyle of patients. However, due to the inherent sensing limitations, systems using these sensors are unable to identify all five stages of sleep.

3.7. Power Consumption

Power consumption is a major design consideration in wearable systems, particularly for sleep staging, where any device should be capable of running for at least 8–10 h, the duration of a normal night's sleep. Unfortunately, very few articles report the power consumption of the specific systems they developed. A single-channel EEG-based integrated circuit for automatic sleep staging developed in [72] reported a power consumption of 575 μW. Another integrated circuit using EEG and EMG channels in [101] reported significantly lower consumption at 5 μW. Finally, an integrated system using EEG, EOG, and EMG channels in [150] reported a power consumption of 71 mW. Apart from the latter system, which has a relatively high power consumption, these are capable of delivering multi-night recordings using small coin cell batteries with a typical capacity of 200–300 mAh.
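As a rough sanity check on the battery-life implications of these figures, the sketch below estimates continuous runtime from a reported power draw and a nominal coin cell; the 220 mAh capacity and 3 V nominal voltage are assumptions within the 200–300 mAh range mentioned above.

```python
def runtime_hours(power_w, capacity_mah=220, voltage_v=3.0):
    """Approximate runtime (hours) of a coin cell for a constant power draw."""
    energy_j = capacity_mah * 1e-3 * 3600 * voltage_v   # stored energy in joules
    return energy_j / power_w / 3600

for label, power in [("575 uW EEG SoC", 575e-6),
                     ("5 uW EEG+EMG IC", 5e-6),
                     ("71 mW EEG/EOG/EMG system", 71e-3)]:
    print(f"{label}: ~{runtime_hours(power):.0f} h")     # ~1148 h, ~132000 h, ~9 h
```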

3.8. Challenges and Future Directions

The rise in consumer focus on sleep awareness as well as an increased understanding of the importance of sleep [159] will lead to many more wearable devices being developed to quantify the different sleep stages and to assess sleep quality. Based on the current trends, it is likely that these future systems will become smaller and easier for patients to use. From a technical development viewpoint, they are likely to use a maximum of one or two sensing modalities to extract direct or indirect parameters that can be mapped to the sleep stages. However, there are a number of usability, reliability, and regulatory challenges that need to be explored.
It is pertinent to note that, regardless of the sensing modality used, sleep staging is only a subset of full polysomnography and, thus, any wearable being developed for sleep staging cannot replace PSG. It can, however, be used to triage patients and to prioritise access to PSG for those who need it the most. Additionally, it can be useful for long-term sleep monitoring of patients as well as other subjects for research. This will require overcoming usability challenges so that patient compliance over several weeks is high. This, in turn, requires creating systems that do not inconvenience users by requiring regular charging and that do not negatively impact their comfort during sleep.
Although an increasing number of single-sensor devices are being developed with improved accuracy, their utility and acceptability in the medical community is limited. For example, the heart rate is used in most commercial fitness trackers for sleep scoring. This is advantageous since the heart signal can be useful as an indicator for other health conditions. However, there are no standard guidelines that define how heart rate changes should be mapped onto the sleep stages. This is true for all signals other than those obtained from the EEG. As a result, such devices are not regulated as medical devices; thus, their diagnostic utility is currently very limited [155]. Future research should tackle this by carrying out large trials to establish the diagnostic utility of non-EEG-based signals for sleep detection and staging.
More and more wearable sensors are being developed to monitor cardiac, respiratory, and other physiological parameters [160,161,162,163]. Despite the shrinking size of sensing technologies and low-power electronic circuitry, one of the limiting factors that remains in PSG is the process of electrode attachment. This is essentially a usability issue that requires more research in the area of flexible electrodes that can be used reliably over a longer period of time. This is important since a wearable multi-channel PSG will provide more diagnostic value than a wearable single-channel sleep staging system.
Finally, the processing and classification of sleep signals, already an important area of focus for researchers, will continue to grow. This will involve creating more low-complexity and higher-accuracy algorithms for signals acquired from the different sensing modalities. However, algorithms developed for similar signals obtained from different sensing modalities may not be compatible with each other [164]. Further, the limited availability of accessible data sets makes it difficult to develop and benchmark such algorithms [165]. Having a wearable system will make it easier to obtain and maintain large data sets of different sleep signals that could be helpful for researchers to assess and compare the accuracy of their algorithms. Since the scope of this paper is limited to sensing modalities, readers interested in a review of algorithmic approaches for automatic sleep staging are referred to [11]. Other deployment challenges of long-term home-based devices include their management, security, updates, and maintenance, as discussed in [166].

4. Conclusions

The limitations of PSG due to its usability and cost have resulted in a number of different sensing modalities being explored for sleep detection and staging. Nevertheless, it can be concluded that, when a single sensing modality is used, EEG is by far the most popular choice, given that it conforms broadly to the AASM scoring guidelines. There are usability and reliability issues with the use of EEG; however, recent advances in the area of wearable EEG [167] such as ear-EEG and tattoo-like electrodes are likely to help with these and to improve the likelihood of obtaining reliable signals during sleep. When a combination of sensors is used, PPG and accelerometry together are the most popular option. This is partly because these sensors are widely available in convenient wrist-worn consumer devices. However, regardless of the convenience in use, this combination of sensing modalities is unable to identify all stages of sleep, and its accuracy is also lower compared to EEG. Further, as a result of the low accuracy of commercial systems that use this combination for sleep staging, they are not deemed a suitable alternative to PSG [138]. Thus, a wearable EEG is more accurate than PPG and accelerometry, whereas the latter is a more attractive option if usability is more important. Other sensing modalities such as ECG, radar, audio, and pressure sensors are also more user-friendly options, with limitations on which sleep stages they can reliably discern.
Although the results in this paper show a clear trend towards specific sensing modalities, there are certain limitations of this study. For instance, because of the focus on sensing modality, the discussion of the different algorithms and processing requirements is not included. This, however, is important from a system design viewpoint since the computational requirements for implementing these algorithms can add further constraints to the sensing power consumption. Additionally, as with the design of any wearable device, various other factors will have different weights depending on the application, and certain trade-offs will be needed. For example, a device designed for medical use will need higher accuracy with resolution of all the stages of sleep, whereas a consumer sleep monitoring device will lean towards usability as the most important factor. Nevertheless, understanding the choice of sensor for wearable sleep staging and the limitations of the signals obtained from that sensor is important in its own right, and the results presented in this paper can help designers of wearable sleep detection and staging devices make these informed choices and trade-offs at early stages of their product development life cycle.

Funding

This research received no external funding.

Conflicts of Interest

The author declares no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
PSG: Polysomnography
EEG: Electroencephalography/Electroencephalogram
EOG: Electrooculography/Electrooculogram
EMG: Electromyography/Electromyogram
ECG: Electrocardiography/Electrocardiogram
BCG: Ballistocardiography/Ballistocardiogram
ACC: Accelerometer
RIP: Respiratory Inductive Plethysmography
EDA: Electrodermal Activity
PPG: Photoplethysmography/Photoplethysmogram
REM: Rapid Eye Movement
NREM: Non-Rapid Eye Movement
R&K: Rechtschaffen and Kales
AASM: American Academy of Sleep Medicine
HBI: Heart Beat Interval
RRI: R-R Interval
PPI: Pulse Peak Interval
HRV: Heart Rate Variability
PRV: Pulse Rate Variability
RRV: Respiratory Rate Variability
HR: Heart Rate
RR: Respiratory Rate
PTT: Pulse Transit Time
EDR: ECG-Derived Respiration
LED: Light Emitting Diode
IoT: Internet of Things
OSA: Obstructive Sleep Apnoea
SDB: Sleep Disordered Breathing
PLMS: Periodic Limb Movements of Sleep

References

  1. Sleep SOS Report: The Impact of Sleep on Society; The Sleep Alliance: San Diego, CA, USA, 2007.
  2. Institute of Medicine; Board on Health Sciences Policy; Committee on Sleep Medicine and Research; Colten, H.R.; Altevogt, B.M. (Eds.) Sleep Disorders and Sleep Deprivation: An Unmet Public Health Problem; The National Academies Press: Washington, DC, USA, 2006. [Google Scholar] [CrossRef]
  3. Liu, S.Y.; Perez, M.A.; Lau, N. The impact of sleep disorders on driving safety—Findings from the Second Strategic Highway Research Program naturalistic driving study. Sleep 2018, 41, zsy023. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Garbarino, S.; Lanteri, P.; Durando, P.; Magnavita, N.; Sannita, W. Co-Morbidity, Mortality, Quality of Life and the Healthcare/Welfare/Social Costs of Disordered Sleep: A Rapid Review. Int. J. Environ. Res. Public Health 2016, 13, 831. [Google Scholar] [CrossRef]
  5. Vaughn, B.V.; Giallanza, P. Technical Review of Polysomnography. Chest 2008, 134, 1310–1319. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  6. Iber, C.; Ancoli-Israel, S.; Chesson, A.; Quan, S. The AASM Manual for the Scoring of Sleep and Associated Events: Rules, Terminology and Technical Specifications; American Academy of Sleep Medicine: Westchester, IL, USA, 2007. [Google Scholar]
  7. Rechtschaffen, A.; Kales, A. A Manual of Standardized Terminology, Techniques and Scoring System for Sleep Stages of Human Subjects; Public Health Service, U.S. Government Printing Office: Washington, DC, USA, 1968.
  8. Flemons, W.W.; Douglas, N.J.; Kuna, S.T.; Rodenstein, D.O.; Wheatley, J. Access to Diagnosis and Treatment of Patients with Suspected Sleep Apnea. Am. J. Respir. Crit. Care Med. 2004, 169, 668–672. [Google Scholar] [CrossRef]
  9. Ronzhina, M.; Janoušek, O.; Kolářová, J.; Nováková, M.; Honzík, P.; Provazník, I. Sleep scoring using artificial neural networks. Sleep Med. Rev. 2012, 16, 251–263. [Google Scholar] [CrossRef]
  10. Bruyneel, M.; Van den Broecke, S.; Libert, W.; Ninane, V. Real-time attended home-polysomnography with telematic data transmission. Int. J. Med. Inform. 2013, 82, 696–701. [Google Scholar] [CrossRef]
  11. Boostani, R.; Karimzadeh, F.; Nami, M. A comparative review on sleep stage classification methods in patients and healthy individuals. Comput. Methods Programs Biomed. 2017, 140, 77–91. [Google Scholar] [CrossRef]
  12. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. BMJ 2009, 339, b2535. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  13. Kortelainen, J.M.; Mendez, M.O.; Bianchi, A.M.; Matteucci, M.; Cerutti, S. Sleep Staging Based on Signals Acquired Through Bed Sensor. IEEE Trans. Inf. Technol. Biomed. 2010, 14, 776–785. [Google Scholar] [CrossRef]
  14. Emfit Ltd. Emfit Sleep Tracking & Monitoring. Available online: https://www.emfit.com/ (accessed on 25 January 2021).
  15. Hedner, J.; White, D.; Malhotra, A.; Cohen, S.; Pittman, S.; Zou, D.; Grote, L.; Pillar, G. Sleep Staging Based on Autonomic Signals: A Multi-Center Validation Study. J. Clin. Sleep Med. 2011, 7, 301–306. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  16. Itamar Medical. WatchPAT 100. Available online: https://www.itamar-medical.com/ (accessed on 11 February 2021).
  17. Sloboda, J.; Das, M. A simple sleep stage identification technique for incorporation in inexpensive electronic sleep screening devices. In Proceedings of the 2011 IEEE National Aerospace and Electronics Conference (NAECON), Dayton, OH, USA, 20–22 July 2011; pp. 21–24. [Google Scholar] [CrossRef]
  18. Ichimaru, Y.; Moody, G. Development of the polysomnographic database on CD-ROM. Psychiatry Clin. Neurosci. 1999, 53, 175–177. [Google Scholar] [CrossRef] [Green Version]
  19. Kanady, J.C.; Drummond, S.P.A.; Mednick, S.C. Actigraphic assessment of a polysomnographic-recorded nap: A validation study. J. Sleep Res. 2011, 20, 214–222. [Google Scholar] [CrossRef] [PubMed]
  20. Philips Respironics. Actiwatch. Available online: https://www.philips.co.uk/healthcare/product/HC1044809/actiwatch-2-activity-monitor (accessed on 11 February 2021).
  21. Shambroom, J.R.; FÁbregas, S.E.; Johnstone, J. Validation of an automated wireless system to monitor sleep in healthy adults. J. Sleep Res. 2012, 21, 221–230. [Google Scholar] [CrossRef]
  22. Zhang, J.; Chen, D.; Zhao, J.; He, M.; Wang, Y.; Zhang, Q. RASS: A portable real-time automatic sleep scoring system. In Proceedings of the 2012 IEEE 33rd Real-Time Systems Symposium, San Juan, PR, USA, 4–7 December 2012; pp. 105–114. [Google Scholar] [CrossRef]
  23. Wang, J.S.; Shih, G.R.; Chiang, W.C. Sleep stage classification of sleep apnea patients using decision-tree-based support vector machines based on ECG parameters. In Proceedings of the 2012 IEEE-EMBS International Conference on Biomedical and Health Informatics, Hong Kong, China, 5–7 January 2012; Volume 25, pp. 285–288. [Google Scholar] [CrossRef]
  24. Nakazaki, K.; Kitamura, S.; Motomura, Y.; Hida, A.; Kamei, Y.; Miura, N.; Mishima, K. Validity of an algorithm for determining sleep/wake states using a new actigraph. J. Physiol. Anthropol. 2014, 33, 31. [Google Scholar] [CrossRef] [Green Version]
  25. Kosmadopoulos, A.; Sargent, C.; Darwent, D.; Zhou, X.; Roach, G. Alternatives to polysomnography (PSG): A validation of wrist actigraphy and a partial-PSG system. Behav. Res. Methods 2014, 46, 1032–1041. [Google Scholar] [CrossRef] [PubMed]
  26. Long, X.; Fonseca, P.; Foussier, J.; Haakma, R.; Aarts, R.M. Sleep and Wake Classification With Actigraphy and Respiratory Effort Using Dynamic Warping. IEEE J. Biomed. Health Inform. 2014, 18, 1272–1284. [Google Scholar] [CrossRef]
  27. Domingues, A.; Paiva, T.; Sanches, J.M. Hypnogram and Sleep Parameter Computation From Activity and Cardiovascular Data. IEEE Trans. Biomed. Eng. 2014, 61, 1711–1719. [Google Scholar] [CrossRef]
  28. SOMNO Medics. SOMNOWatch Plus. Available online: https://somnomedics.de/en/solutions/sleep_diagnostics/actigraphy/somnowatch-plus-actigraphy/ (accessed on 11 February 2021).
  29. Imtiaz, S.A.; Rodriguez-Villegas, E. A Low Computational Cost Algorithm for REM Sleep Detection Using Single Channel EEG. Ann. Biomed. Eng. 2014, 42, 2344–2359. [Google Scholar] [CrossRef] [Green Version]
  30. University of MONS—TCTS Laboratory. The DREAMS Subjects Database. Available online: https://zenodo.org/record/2650142#.YA6QV5P7TJ8 (accessed on 20 December 2020).
  31. Aboalayon, K.A.; Ocbagabir, H.T.; Faezipour, M. Efficient sleep stage classification based on EEG signals. In Proceedings of the IEEE Long Island Systems, Applications and Technology (LISAT) Conference 2014, Farmingdale, NY, USA, 2 May 2014. [Google Scholar] [CrossRef]
  32. Kemp, B.; Zwinderman, A.H.; Tuk, B.; Kamphuisen, H.A.C.; Oberye, J.J.L. Analysis of a sleep-dependent neuronal feedback loop: The slow-wave microcontinuity of the EEG. IEEE Trans. Biomed. Eng. 2000, 47, 1185–1194. [Google Scholar] [CrossRef] [PubMed]
  33. Dehkordi, P.; Garde, A.; Karlen, W.; Wensley, D.; Ansermino, J.M.; Dumont, G.A. Sleep stage classification in children using photo plethysmogram pulse rate variability. Comput. Cardiol. 2014, 41, 297–300. [Google Scholar]
  34. Samy, L.; Huang, M.C.; Liu, J.J.; Xu, W.; Sarrafzadeh, M. Unobtrusive sleep stage identification using a pressure-sensitive bed sheet. IEEE Sens. J. 2014, 14, 2092–2101. [Google Scholar] [CrossRef]
  35. Slater, J.; Botsis, T.; Walsh, J.; King, S.; Straker, L.; Eastwood, P. Assessing sleep using hip and wrist actigraphy: Hip and wrist actigraphy. Sleep Biol. Rhythm. 2015, 13. [Google Scholar] [CrossRef] [Green Version]
  36. ActiGraph. GT3X. Available online: https://actigraphcorp.com/support/activity-monitors/gt3x/ (accessed on 21 January 2021).
  37. Fietze, I.; Penzel, T.; Partinen, M.; Sauter, J.; Küchler, G.; Suvoro, A.; Hein, H. Actigraphy combined with EEG compared to polysomnography in sleep apnea patients. Physiol. Meas. 2015, 36, 385–396. [Google Scholar] [CrossRef]
  38. SOMNOmedics. SOMNOscreen. Available online: https://somnomedics.de/en/solutions/sleep_diagnostics/stationary_sleep_lab_psg/somnoscreen-plus/ (accessed on 21 December 2020).
  39. Wu, H.T.; Talmon, R.; Lo, Y.L. Assess Sleep Stage by Modern Signal Processing Techniques. IEEE Trans. Biomed. Eng. 2015, 62, 1159–1168. [Google Scholar] [CrossRef] [Green Version]
  40. Chen, Y.; Zhu, X.; Chen, W. Automatic sleep staging based on ECG signals using hidden Markov models. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 530–533. [Google Scholar] [CrossRef]
  41. Terzano, M.; Parrino, L.; Sherieri, A.; Chervin, R.; Chokroverty, S.; Guilleminault, C.; Hirshkowitz, M.; Mahowald, M.; Moldofsky, H.; Rosa, A.; et al. Atlas, rules, and recording techniques for the scoring of cyclic alternating pattern (CAP) in human sleep. Sleep Med. 2001, 2, 537–553. [Google Scholar] [CrossRef]
  42. Imtiaz, S.; Rodriguez-Villegas, E. Automatic sleep staging using state machine-controlled decision trees. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015. [Google Scholar] [CrossRef] [Green Version]
  43. Wang, Y.; Loparo, K.; Kelly, M.; Kalpan, R. Evaluation of an automated single-channel sleep staging algorithm. Nat. Sci. Sleep 2015, 7, 101–111. [Google Scholar] [CrossRef] [Green Version]
  44. General Sleep Corporation. Zmachine. Available online: https://www.generalsleep.com/zmachine-insight.html (accessed on 20 December 2020).
  45. Hassan, A.R.; Bashar, S.K.; Bhuiyan, M.I.H. On the classification of sleep states by means of statistical and spectral features from single channel Electroencephalogram. In Proceedings of the 2015 International Conference on Advances in Computing, Communications and Informatics (ICACCI), Kochi, India, 10–13 August 2015; pp. 2238–2243. [Google Scholar] [CrossRef]
  46. Tataraidze, A.; Anishchenko, L.; Korostovtseva, L.; Kooij, B.J.; Bochkarev, M.; Sviryaev, Y. Sleep stage classification based on bioradiolocation signals. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 362–365. [Google Scholar] [CrossRef]
  47. Tataraidze, A.; Anishchenko, L.; Korostovtseva, L.; Kooij, B.J.; Bochkarev, M.; Sviryaev, Y. Sleep stage classification based on respiratory signal. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 358–361. [Google Scholar] [CrossRef]
  48. Natus Neurology. Embla N7000. Available online: https://partners.natus.com/ (accessed on 20 December 2020).
  49. Kagawa, M.; Sasaki, N.; Suzumura, K.; Matsui, T. Sleep stage classification by body movement index and respiratory interval indices using multiple radar sensors. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 7606–7609. [Google Scholar] [CrossRef]
  50. Nguyen, A.; Alqurashi, R.; Raghebi, Z.; Banaei-Kashani, F.; Halbower, A.C.; Vu, T. A lightweight and inexpensive in-ear sensing system for automatic whole-night sleep stage monitoring. In Proceedings of the 14th ACM Conference on Embedded Network Sensor Systems CD-ROM; ACM: New York, NY, USA, 2016; pp. 230–244. [Google Scholar] [CrossRef]
  51. Kagawa, M.; Suzumura, K.; Matsui, T. Sleep stage classification by non-contact vital signs indices using Doppler radar sensors. In Proceedings of the 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; pp. 4913–4916. [Google Scholar] [CrossRef]
  52. Singh, J.; Sharma, R.K.; Gupta, A.K. A method of REM-NREM sleep distinction using ECG signal for unobtrusive personal monitoring. Comput. Biol. Med. 2016, 78, 138–143. [Google Scholar] [CrossRef]
  53. O’Reilly, C.; Gosselin, N.; Carrier, J.; Nielsen, T. Montreal Archive of Sleep Studies: An open-access resource for instrument benchmarking and exploratory research. J. Sleep Res. 2014, 23, 628–635. [Google Scholar] [CrossRef]
  54. Ye, Y.; Yang, K.; Jiang, J.; Ge, B. Automatic sleep and wake classifier with heart rate and pulse oximetry: Derived dynamic time warping features and logistic model. In Proceedings of the 2016 Annual IEEE Systems Conference (SysCon), Orlando, FL, USA, 18–21 April 2016; pp. 1–6. [Google Scholar] [CrossRef]
  55. Quan, S.; Howard, B.; Iber, C.; Kiley, J.; Nieto, F.; O’Connor, G.; Rapoport, D.; Redline, S.; Robbins, J.; Samet, J.; et al. The Sleep Heart Health Study: Design, Rationale, and Methods. Sleep 1998, 20, 1077–1085. [Google Scholar] [CrossRef] [Green Version]
  56. Dafna, E.; Halevi, M.; Ben Or, D.; Tarasiuk, A.; Zigel, Y. Estimation of macro sleep stages from whole night audio analysis. In Proceedings of the 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; pp. 2847–2850. [Google Scholar] [CrossRef]
  57. Solid State Sound. Edirol R-4. Available online: https://www.solidstatesound.co.uk/edirol_r4.htm (accessed on 21 December 2020).
  58. Shahrbabaki, S.S.; Ahmed, B.; Penzel, T.; Cvetkovic, D. Pulse transit time and heart rate variability in sleep staging. In Proceedings of the 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; pp. 3469–3472. [Google Scholar] [CrossRef]
  59. Yang, J.; Keller, J.M.; Popescu, M.; Skubic, M. Sleep stage recognition using respiration signal. In Proceedings of the 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; pp. 2843–2846. [Google Scholar] [CrossRef]
  60. Lucey, B.P.; Mcleland, J.S.; Toedebusch, C.D.; Boyd, J.; Morris, J.C.; Landsness, E.C.; Yamada, K.; Holtzman, D.M. Comparison of a single-channel EEG sleep study to polysomnography. J. Sleep Res. 2016, 25, 625–635. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  61. Advanced Brain Monitoring. Sleep Profiler. Available online: https://www.advancedbrainmonitoring.com/products/sleep-profiler (accessed on 21 December 2020).
  62. Osterbauer, B.; Koempel, J.; Ward, S.; Fisher, L.; Don, D. A Comparison Study Of The Fitbit Activity Monitor And PSG For Assessing Sleep Patterns And Movement In Children. J. Otolaryngol. Adv. 2016, 1, 24. [Google Scholar] [CrossRef] [Green Version]
  63. Fitbit. Fitbit Activity Trackers. Available online: https://www.fitbit.com/ (accessed on 21 December 2020).
  64. de Zambotti, M.; Baker, F.C.; Willoughby, A.R.; Godino, J.G.; Wing, D.; Patrick, K.; Colrain, I.M. Measures of sleep and cardiac functioning during sleep using a multi-sensory commercially-available wristband in adolescents. Physiol. Behav. 2016, 158, 143–149. [Google Scholar] [CrossRef] [Green Version]
  65. Nakamura, T.; Goverdovsky, V.; Morrell, M.J.; Mandic, D.P. Automatic Sleep Monitoring Using Ear-EEG. IEEE J. Transl. Eng. Health Med. 2017, 5, 1–8. [Google Scholar] [CrossRef] [PubMed]
  66. Fonseca, P.; Den Teuling, N.; Long, X.; Aarts, R.M. Cardiorespiratory Sleep Stage Detection Using Conditional Random Fields. IEEE J. Biomed. Health Inform. 2017, 21, 956–966. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  67. da Silveira, T.; Kozakevicius, A.; Rodrigues, C. Single-channel EEG sleep stage classification based on a streamlined set of statistical features in wavelet domain. Med Biol. Eng. Comput. 2017, 55, 343–352. [Google Scholar] [CrossRef]
  68. Beattie, Z.; Oyang, Y.; Statan, A.; Ghoreyshi, A.; Pantelopoulos, A.; Russell, A.; Heneghan, C. Estimation of sleep stages in a healthy adult population from optical plethysmography and accelerometer signals. Physiol. Meas. 2017, 38, 1968–1979. [Google Scholar] [CrossRef]
  69. Yuda, E.; Yoshida, Y.; Sasanabe, R.; Tanaka, H.; Shiomi, T.; Hayano, J. Sleep stage classification by a combination of actigraphic and heart rate signals. J. Low Power Electron. Appl. 2017, 7, 28. [Google Scholar] [CrossRef] [Green Version]
  70. Dey, J.; Bhowmik, T.; Sahoo, S.; Tiwari, V.N. Wearable PPG sensor based alertness scoring system. In Proceedings of the 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Jeju, Korea, 11–15 July 2017; pp. 2422–2425. [Google Scholar] [CrossRef]
  71. Fonseca, P.; Weysen, T.; Goelema, M.S.; Møst, E.I.; Radha, M.; Lunsingh Scheurleer, C.; van den Heuvel, L.; Aarts, R.M. Validation of Photoplethysmography-Based Sleep Staging Compared With Polysomnography in Healthy Middle-Aged Adults. Sleep 2017, 40, zsx097. [Google Scholar] [CrossRef]
  72. Imtiaz, S.A.; Jiang, Z.; Rodriguez-Villegas, E. An Ultralow Power System on Chip for Automatic Sleep Staging. IEEE J. Solid-State Circuits 2017, 52, 822–833. [Google Scholar] [CrossRef]
  73. Wei, R.; Zhang, X.; Wang, J.; Dang, X. The research of sleep staging based on single-lead electrocardiogram and deep neural network. Biomed. Eng. Lett. 2017, 8, 87–93. [Google Scholar] [CrossRef]
  74. Tal, A.; Shinar, Z.; Shaki, D.; Codish, S.; Goldbart, A. Validation of Contact-Free Sleep Monitoring Device with Comparison to Polysomnography. J. Clin. Sleep Med. 2017, 13. [Google Scholar] [CrossRef] [Green Version]
  75. EarlySense. EarlySense Proactive Patient Care. Available online: https://www.earlysense.com/ (accessed on 21 January 2021).
  76. Dimitriadis, S.I.; Salis, C.; Linden, D. A novel, fast and efficient single-sensor automatic sleep-stage classification based on complementary cross-frequency coupling estimates. Clin. Neurophysiol. 2018, 129, 815–828. [Google Scholar] [CrossRef]
  77. Li, Q.; Li, Q.; Liu, C.; Shashikumar, S.; Nemati, S.; Clifford, G. Deep learning in the cross-time-frequency domain for sleep staging from a single lead electrocardiogram. Physiol. Meas. 2018, 39. [Google Scholar] [CrossRef] [PubMed]
  78. de Zambotti, M.; Goldstone, A.; Claudatos, S.; Colrain, I.M.; Baker, F.C. A validation study of Fitbit Charge 2™ compared with polysomnography in adults. Chronobiol. Int. 2018, 35, 465–476. [Google Scholar] [CrossRef] [PubMed]
  79. Widasari, E.R.; Tanno, K.; Tamura, H. Automatic Sleep Stage Detection Based on HRV Spectrum Analysis. In Proceedings of the 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Miyazaki, Japan, 7–10 October 2018; pp. 869–874. [Google Scholar] [CrossRef]
  80. Zhang, X.; Kou, W.; Chang, E.I.; Gao, H.; Fan, Y.; Xu, Y. Sleep stage classification based on multi-level feature learning and recurrent neural networks via wearable device. Comput. Biol. Med. 2018, 103, 71–81. [Google Scholar] [CrossRef] [Green Version]
  81. Quante, M.; Kaplan, E.; Cailler, M.; Rueschman, M.; Wang, R.; Weng, J.; Taveras, E.; Redline, S. Actigraphy-based sleep estimation in adolescents and adults: A comparison with polysomnography using two scoring algorithms. Nat. Sci. Sleep 2018, 10, 13–20. [Google Scholar] [CrossRef] [Green Version]
  82. Lesmana, T.F.; Isa, S.M.; Surantha, N. Sleep Stage Identification Using the Combination of ELM and PSO Based on ECG Signal and HRV. In Proceedings of the 2018 3rd International Conference on Computer and Communication Systems (ICCCS), Nagoya, Japan, 27–30 April 2018; pp. 436–439. [Google Scholar] [CrossRef]
  83. Yoon, H.; Hwang, S.H.; Choi, J.W.; Lee, Y.J.; Jeong, D.U.; Park, K.S. Slow-Wave Sleep Estimation for Healthy Subjects and OSA Patients Using R-R Intervals. IEEE J. Biomed. Health Inform. 2018, 22, 119–128. [Google Scholar] [CrossRef]
  84. Pigeon, W.R.; Taylor, M.; Bui, A.; Oleynk, C.; Walsh, P.; Bishop, T.M. Validation of the sleep-wake scoring of a new wrist-worn sleep monitoring device. J. Clin. Sleep Med. 2018, 14, 1057–1062. [Google Scholar] [CrossRef]
  85. Palotti, J.; Mall, R.; Aupetit, M.; Rueschman, M.; Singh, M.; Sathyanarayana, A.; Taheri, S.; Fernandez-Luque, L. Benchmark on a large cohort for sleep-wake classification with machine learning techniques. NPJ Digit. Med. 2019, 2, 50. [Google Scholar] [CrossRef] [Green Version]
  86. Chen, X.; Wang, R.; Zee, P.; Lutsey, P.L.; Javaheri, S.; Alcántara, C.; Jackson, C.L.; Williams, M.A.; Redline, S. Racial/Ethnic Differences in Sleep Disturbances: The Multi-Ethnic Study of Atherosclerosis (MESA). Sleep 2015, 38, 877–888. [Google Scholar] [CrossRef] [PubMed]
  87. Herlan, A.; Ottenbacher, J.; Schneider, J.; Riemann, D.; Feige, B. Electrodermal activity patterns in sleep stages and their utility for sleep versus wake classification. J. Sleep Res. 2019, 28, e12694. [Google Scholar] [CrossRef] [PubMed]
  88. Wei, Y.; Qi, X.; Wang, H.; Liu, Z.; Wang, G.; Yan, X. A Multi-Class Automatic Sleep Staging Method Based on Long Short-Term Memory Network Using Single-Lead Electrocardiogram Signals. IEEE Access 2019, 7, 85959–85970. [Google Scholar] [CrossRef]
  89. An, P.; Si, W.; Ding, S.; Xue, G.; Yuan, Z. A novel EEG sleep staging method for wearable devices based on amplitude-time mapping. In Proceedings of the 2019 IEEE 4th International Conference on Advanced Robotics and Mechatronics (ICARM), Toyonaka, Japan, 3–5 July 2019; pp. 124–129. [Google Scholar] [CrossRef]
  90. Shen, H.; Xu, M.; Guez, A.; Li, A.; Ran, F. An accurate sleep stages classification method based on state space model. IEEE Access 2019, 7, 125268–125279. [Google Scholar] [CrossRef]
  91. Cho, T.; Sunarya, U.; Yeo, M.; Hwang, B.; Koo, Y.S.; Park, C. Deep-ACTINet: End-to-end deep learning architecture for automatic sleep-wake detection using wrist actigraphy. Electronics 2019, 8, 1461. [Google Scholar] [CrossRef] [Green Version]
  92. Yi, R.; Enayati, M.; Keller, J.M.; Popescu, M.; Skubic, M. Non-invasive in-home sleep stage classification using a ballistocardiography bed sensor. In Proceedings of the 2019 IEEE EMBS International Conference on Biomedical & Health Informatics (BHI), Chicago, IL, USA, 19–22 May 2019; pp. 23–26. [Google Scholar] [CrossRef]
  93. Koushik, A.; Amores, J.; Maes, P. Real-time Smartphone-based Sleep Staging using 1-Channel EEG. In Proceedings of the 2019 IEEE 16th International Conference on Wearable and Implantable Body Sensor Networks (BSN), Chicago, IL, USA, 19–22 May 2019; pp. 2–5. [Google Scholar] [CrossRef]
  94. Zhang, Y.; Yang, Z.; Lan, K.; Liu, X.; Zhang, Z.; Li, P.; Cao, D.; Zheng, J.; Pan, J. Sleep stage classification using bidirectional lstm in wearable multi-sensor systems. arXiv 2019, arXiv:1909.11141. [Google Scholar]
  95. Walch, O.; Huang, Y.; Forger, D.; Goldstein, C. Sleep stage prediction with raw acceleration and photoplethysmography heart rate data derived from a consumer wearable device. Sleep 2019, 42, zsz180. [Google Scholar] [CrossRef]
  96. Apple. Apple Watch. Available online: https://www.apple.com/uk/watch/ (accessed on 21 January 2021).
  97. Haghayegh, S.; Khoshnevis, S.; Smolensky, M.H.; Diller, K.R.; Castriotta, R.J. Performance comparison of different interpretative algorithms utilized to derive sleep parameters from wrist actigraphy data. Chronobiol. Int. 2019, 36, 1752–1760. [Google Scholar] [CrossRef] [PubMed]
  98. Aerobtec. Motion Logger. Available online: https://aerobtec.com/motion-logger/ (accessed on 21 January 2021).
  99. Fedorin, I.; Slyusarenko, K.; Lee, W.; Sakhnenko, N. Sleep stages classification in a healthy people based on optical plethysmography and accelerometer signals via wearable devices. In Proceedings of the 2019 IEEE 2nd Ukraine Conference on Electrical and Computer Engineering (UKRCON), Lviv, Ukraine, 2–6 July 2019; pp. 1201–1204. [Google Scholar] [CrossRef]
  100. Samsung. Smartwatches & Fitness Trackers. Available online: https://www.samsung.com/uk/watches/all-watches/ (accessed on 21 January 2021).
  101. Chang, S.; Wu, B.; Liou, Y.; Zheng, R.; Lee, P.; Chiueh, T.; Liu, T. An Ultra-Low-Power Dual-Mode Automatic Sleep Staging Processor Using Neural-Network-Based Decision Tree. IEEE Trans. Circuits Syst. I Regul. Pap. 2019, 66, 3504–3516. [Google Scholar] [CrossRef]
  102. Mikkelsen, K.B.; Ebajemito, J.K.; Bonmati-Carrion, M.A.; Santhi, N.; Revell, V.L.; Atzori, G.; della Monica, C.; Debener, S.; Dijk, D.J.; Sterr, A.; et al. Machine-learning-derived sleep–wake staging from around-the-ear electroencephalogram outperforms manual scoring and actigraphy. J. Sleep Res. 2019, 28, e12786. [Google Scholar] [CrossRef]
  103. Miller, D.J.; Lastella, M.; Scanlan, A.T.; Bellenger, C.; Halson, S.L.; Roach, G.D.; Sargent, C. A validation study of the WHOOP strap against polysomnography to assess sleep. J. Sports Sci. 2020, 38, 2631–2636. [Google Scholar] [CrossRef] [PubMed]
  104. WHOOP. WHOOP Strap. Available online: https://www.whoop.com/ (accessed on 21 January 2021).
  105. Jørgensen, S.D.; Zibrandtsen, I.C.; Kjaer, T.W. Ear-EEG-based sleep scoring in epilepsy: A comparison with scalp-EEG. J. Sleep Res. 2019, 29, e12921. [Google Scholar] [CrossRef] [PubMed]
  106. Zaffaroni, A.; Coffey, S.; Dodd, S.; Kilroy, H.; Lyon, G.; O’Rourke, D.; Lederer, K.; Fietze, I.; Penzel, T. Sleep Staging Monitoring Based on Sonar Smartphone Technology. In Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 2230–2233. [Google Scholar] [CrossRef]
  107. Kalkbrenner, C.; Brucher, R.; Kesztyüs, T.; Eichenlaub, M.; Rottbauer, W.; Scharnbeck, D. Automated sleep stage classification based on tracheal body sound and actigraphy. Ger. Med. Sci. GMS e-J. 2019, 17. [Google Scholar] [CrossRef]
  108. Nakamura, T.; Alqurashi, Y.D.; Morrell, M.J.; Mandic, D.P. Hearables: Automatic Overnight Sleep Monitoring with Standardized In-Ear EEG Sensor. IEEE Trans. Biomed. Eng. 2020, 67, 203–212. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  109. Abou Jaoude, M.; Sun, H.; Pellerin, K.R.; Pavlova, M.; Sarkis, R.A.; Cash, S.S.; Westover, M.B.; Lam, A.D. Expert-level automated sleep staging of long-term scalp electroencephalography recordings using deep learning. Sleep 2020, 43, zsaa112. [Google Scholar] [CrossRef]
  110. Rosen, C.L.; Auckley, D.; Benca, R.; Foldvary-Schaefer, N.; Iber, C.; Kapur, V.; Rueschman, M.; Zee, P.; Redline, S. A Multisite Randomized Trial of Portable Sleep Studies and Positive Airway Pressure Autotitration Versus Laboratory-Based Polysomnography for the Diagnosis and Treatment of Obstructive Sleep Apnea: The HomePAP Study. Sleep 2012, 35, 757–767. [Google Scholar] [CrossRef]
  111. Xue, B.; Deng, B.; Hong, H.; Wang, Z.; Zhu, X.; Feng, D.D. Non-Contact Sleep Stage Detection Using Canonical Correlation Analysis of Respiratory Sound. IEEE J. Biomed. Health Inform. 2020, 24, 614–625. [Google Scholar] [CrossRef] [PubMed]
  112. Haghayegh, S.; Khoshnevis, S.; Smolensky, M.H.; Diller, K.R.; Castriotta, R.J. Performance assessment of new-generation Fitbit technology in deriving sleep parameters and stages. Chronobiol. Int. 2020, 37, 47–59. [Google Scholar] [CrossRef] [PubMed]
  113. Lauteslager, T.; Kampakis, S.; Williams, A.J.; Maslik, M.; Siddiqui, F. Performance Evaluation of the Circadia Contactless Breathing Monitor and Sleep Analysis Algorithm for Sleep Stage Classification. In Proceedings of the 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Montreal, QC, Canada, 20–24 July 2020; pp. 5150–5153. [Google Scholar] [CrossRef]
  114. Circadia. C100 Contactless System. Available online: https://circadia.health/ (accessed on 21 January 2021).
  115. Cheung, J.; Leary, E.B.; Lu, H.; Zeitzer, J.M.; Mignot, E. PSG Validation of minute-to-minute scoring for sleep and wake periods in a consumer wearable device. PLoS ONE 2020, 15, e0238464. [Google Scholar] [CrossRef] [PubMed]
  116. Gasmi, A.; Augusto, V.; Beaudet, P.A.; Faucheu, J.; Morin, C.; Serpaggi, X.; Vassel, F. Sleep stages classification using cardio-respiratory variables. In Proceedings of the 2020 IEEE 16th International Conference on Automation Science and Engineering (CASE), Hong Kong, China, 20–21 August 2020; pp. 1031–1036. [Google Scholar] [CrossRef]
  117. Kanady, J.C.; Ruoff, L.; Straus, L.D.; Varbel, J.; Metzler, T.; Richards, A.; Inslicht, S.S.; O’Donovan, A.; Hlavin, J.; Neylan, T.C. Validation of sleep measurement in a multisensor consumer grade wearable device in healthy young adults. J. Clin. Sleep Med. 2020, 16, 917–924. [Google Scholar] [CrossRef]
  118. Haghayegh, S.; Khoshnevis, S.; Smolensky, M.H.; Diller, K.R. Application of deep learning to improve sleep scoring of wrist actigraphy. Sleep Med. 2020, 74, 235–241. [Google Scholar] [CrossRef]
  119. Roberts, D.M.; Schade, M.M.; Mathew, G.M.; Gartenberg, D.; Buxton, O.M. Detecting sleep using heart rate and motion data from multisensor consumer-grade wearables, relative to wrist actigraphy and polysomnography. Sleep 2020, 43. [Google Scholar] [CrossRef]
  120. Oura. Oura Ring. Available online: https://ouraring.com/ (accessed on 21 January 2021).
  121. Korkalainen, H.; Aakko, J.; Duce, B.; Kainulainen, S.; Leino, A.; Nikkonen, S.; Afara, I.O.; Myllymaa, S.; Töyräs, J.; Leppänen, T. Deep learning enables sleep staging from photoplethysmogram for patients with suspected sleep apnea. Sleep 2020, 43, zsaa098. [Google Scholar] [CrossRef] [PubMed]
  122. Nonin. Xpod. Available online: https://www.nonin.com/products/xpod/ (accessed on 11 February 2021).
  123. Motin, M.; Karmakar, C.; Palaniswami, M.; Penzel, T. PPG based automated sleep-wake classification using support vector machine. Physiol. Meas. 2020, 41. [Google Scholar] [CrossRef]
  124. Fonseca, P.; van Gilst, M.M.; Radha, M.; Ross, M.; Moreau, A.; Cerny, A.; Anderer, P.; Long, X.; van Dijk, J.P.; Overeem, S. Automatic sleep staging using heart rate variability, body movements, and recurrent neural networks in a sleep disordered population. Sleep 2020, 43, zsaa048. [Google Scholar] [CrossRef] [PubMed]
  125. van Gilst, M.M.; van Dijk, J.P.; Krijn, R.; Hoondert, B.; Fonseca, P.; van Sloun, R.J.G.; Arsenali, B.; Vandenbussche, N.; Pillen, S.; Maass, H.; et al. Protocol of the SOMNIA project: An observational study to create a neurophysiological database for advanced clinical sleep monitoring. BMJ Open 2019, 9. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  126. Li, X.; Zhang, Y.; Jiang, F.; Zhao, H. A novel machine learning unsupervised algorithm for sleep/wake identification using actigraphy. Chronobiol. Int. 2020, 37, 1002–1015. [Google Scholar] [CrossRef]
  127. Sridhar, N.; Shoeb, A.; Stephens, P.; Kharbouch, A.; Shimol, D.; Burkart, J.; Ghoreyshi, A.; Myers, L. Deep learning for automated sleep staging using instantaneous heart rate. NPJ Digit. Med. 2020, 3, 106. [Google Scholar] [CrossRef]
  128. Kuula, L.; Pesonen, A.K. Heart Rate Variability and Firstbeat Method for Detecting Sleep Stages in Healthy Young Adults: Feasibility Study. JMIR Mhealth Uhealth 2021, 9, e24704. [Google Scholar] [CrossRef]
  129. Firstbeat Technologies, Oy. Firstbeat Bodyguard. Available online: https://www.firstbeat.com/en/ (accessed on 11 February 2021).
  130. Devine, J.K.; Chinoy, E.D.; Markwald, R.R.; Schwartz, L.P.; Hursh, S.R. Validation of Zulu Watch against Polysomnography and Actigraphy for On-Wrist Sleep-Wake Determination and Sleep-Depth Estimation. Sensors 2021, 21, 76. [Google Scholar] [CrossRef] [PubMed]
  131. Scott, H.; Lovato, N.; Lack, L. The Development and Accuracy of the THIM Wearable Device for Estimating Sleep and Wakefulness. Nat. Sci. Sleep 2021, 13, 39–53. [Google Scholar] [CrossRef]
  132. Re-Time Pty Ltd. THIM. Available online: https://thim.io/ (accessed on 11 February 2021).
  133. Griessenberger, H.; Heib, D.; Kunz, A.; Hoedlmoser, K.; Schabus, M. Assessment of a wireless headband for automatic sleep scoring. Sleep Breath. 2012, 17, 747–752. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  134. Myllymaa, S.; Muraja-Murro, A.; Westeren-Punnonen, S.; Hukkanen, T.; Lappalainen, R.; Mervaala, E.; Töyräs, J.; Sipilä, K.; Myllymaa, K. Assessment of the suitability of using a forehead EEG electrode set and chin EMG electrodes for sleep staging in polysomnography. J. Sleep Res. 2016, 25, 636–645. [Google Scholar] [CrossRef] [PubMed]
  135. Casson, A.J.; Smith, S.; Duncan, J.S.; Rodriguez-Villegas, E. Wearable EEG: What is it, why is it needed and what does it entail? In Proceedings of the 2008 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver, BC, Canada, 20–25 August 2008; pp. 5867–5870. [Google Scholar] [CrossRef]
  136. Ako, M.; Kawara, T.; Uchida, S.; Miyazaki, S.; Nishihara, K.; Mukai, J.; Hirao, K.; Ako, J.; Okubo, Y. Correlation between electroencephalography and heart rate variability during sleep. Psychiatry Clin. Neurosci. 2003, 57, 59–65. [Google Scholar] [CrossRef]
  137. Allen, J. Photoplethysmography and its application in clinical physiological measurement. Physiol. Meas. 2007, 28, R1–R39. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  138. Haghayegh, S.; Khoshnevis, S.; Smolensky, M.H.; Diller, K.R.; Castriotta, R.J. Accuracy of Wristband Fitbit Models in Assessing Sleep: Systematic Review and Meta-Analysis. J. Med. Internet Res. 2019, 21, e16273. [Google Scholar] [CrossRef] [PubMed]
  139. Penzel, T.; Kantelhardt, J.; Lo, C.C.; Voigt, K.; Vogelmeier, C. Dynamics of Heart Rate and Sleep Stages in Normals and Patients with Sleep Apnea. Neuropsychopharmacology 2003, 28 (Suppl. 1), S48–S53. [Google Scholar] [CrossRef] [PubMed]
  140. Gutierrez, G.; Williams, J.; Alrehaili, G.A.; McLean, A.; Pirouz, R.; Amdur, R.; Jain, V.; Ahari, J.; Bawa, A.; Kimbro, S. Respiratory rate variability in sleeping adults without obstructive sleep apnea. Physiol. Rep. 2016, 4, e12949. [Google Scholar] [CrossRef]
  141. Hakim, A. Wrist Actigraphy. Chest 2011, 139, 1514–1527. [Google Scholar] [CrossRef]
  142. Kaplan, V.; Zhang, J.; Russi, E.; Bloch, K. Detection of inspiratory flow limitation during sleep by computer assisted respiratory inductive plethysmography. Eur. Respir. J. 2000, 15, 570–578. [Google Scholar] [CrossRef] [PubMed]
  143. Chang, W.Y.; Huang, C.C.; Chen, C.C.; Chang, C.C.; Yang, C.L. Design of a Novel Flexible Capacitive Sensing Mattress for Monitoring Sleeping Respiratory. Sensors 2014, 14, 22021–22038. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  144. Park, S.W.; Das, P.S.; Chhetry, A.; Park, J.Y. A Flexible Capacitive Pressure Sensor for Wearable Respiration Monitoring System. IEEE Sens. J. 2017, 17, 6558–6564. [Google Scholar] [CrossRef]
  145. Costanzo, S. Software-Defined Doppler Radar Sensor for Human Breathing Detection. Sensors 2019, 19, 3085. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  146. Dafna, E.; Rosenwein, T.; Tarasiuk, A.; Zigel, Y. Breathing rate estimation during sleep using audio signal analysis. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 5981–5984. [Google Scholar] [CrossRef]
  147. Mcnicholas, W.T.; Coffey, M.; Boyle, T. Effects of Nasal Airflow on Breathing during Sleep in Normal Humans. Am. Rev. Respir. Dis. 1993, 147, 620–623. [Google Scholar] [CrossRef] [PubMed]
  148. Sano, A.; Picard, R.W.; Stickgold, R. Quantitative analysis of wrist electrodermal activity during sleep. Int. J. Psychophysiol. 2014, 94, 382–389. [Google Scholar] [CrossRef]
  149. Morgenthaler, T.; Alessi, C.; Friedman, L.; Owens, J.; Kapur, V.; Boehlecke, B.; Brown, T.; Coleman, J.; Lee-Chiong, T.; Pancer, J.; et al. Practice Parameters for the Use of Actigraphy in the Assessment of Sleep and Sleep Disorders: An Update for 2007. Sleep 2007, 30, 519–529. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  150. Kim, S.W.; Lee, K.; Yeom, J.; Lee, T.H.; Kim, D.H.; Kim, J.J. Wearable Multi-Biosignal Analysis Integrated Interface with Direct Sleep-Stage Classification. IEEE Access 2020, 8, 46131–46140. [Google Scholar] [CrossRef]
  151. Estrada, E.; Nazeran, H.; Barragan, J.; Burk, J.R.; Lucas, E.A.; Behbehani, K. EOG and EMG: Two Important Switches in Automatic Sleep Stage Classification. In Proceedings of the 2006 International Conference of the IEEE Engineering in Medicine and Biology Society, New York, NY, USA, 30 August–3 September 2006; pp. 2458–2461. [Google Scholar] [CrossRef]
  152. Imtiaz, S.A.; Rodriguez-Villegas, E. Recommendations for performance assessment of automatic sleep staging algorithms. In Proceedings of the 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Chicago, IL, USA, 26–30 August 2014; pp. 5044–5047. [Google Scholar] [CrossRef] [Green Version]
  153. Svensson, T.; il Chung, U.; Tokuno, S.; Nakamura, M.; Svensson, A.K. A validation study of a consumer wearable sleep tracker compared to a portable EEG system in naturalistic conditions. J. Psychosom. Res. 2019, 126, 109822. [Google Scholar] [CrossRef]
  154. Rapoport, D.M. Non-Contact Sleep Monitoring: Are We There Yet? J. Clin. Sleep Med. 2019, 15, 935–936. [Google Scholar] [CrossRef] [PubMed]
  155. Danzig, R.; Wang, M.; Shah, A.; Trotti, L.M. The wrist is not the brain: Estimation of sleep by clinical and consumer wearable actigraphy devices is impacted by multiple patient- and device-specific factors. J. Sleep Res. 2020, 29, e12926. [Google Scholar] [CrossRef] [PubMed]
  156. Moon, N.W.; Baker, P.M.; Goughnour, K. Designing wearable technologies for users with disabilities: Accessibility, usability, and connectivity factors. J. Rehabil. Assist. Technol. Eng. 2019, 6, 2055668319862137. [Google Scholar] [CrossRef] [Green Version]
  157. Liang, J.; Xian, D.; Liu, X.; Fu, J.; Zhang, X.; Tang, B.; Lei, J. Usability Study of Mainstream Wearable Fitness Devices: Feature Analysis and System Usability Scale Evaluation. JMIR Mhealth Uhealth 2018, 6, e11066. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  158. Rodriguez-Villegas, E.; Iranmanesh, S.; Imtiaz, S.A. Wearable Medical Devices: High-Level System Design Considerations and Tradeoffs. IEEE Solid-State Circuits Mag. 2018, 10, 43–52. [Google Scholar] [CrossRef]
  159. Walker, M. Why We Sleep: Unlocking the Power of Sleep and Dreams; Scribner: New York, NY, USA, 2017. [Google Scholar]
  160. Blanco-Almazán, D.; Groenendaal, W.; Catthoor, F.; Jané, R. Wearable Bioimpedance Measurement for Respiratory Monitoring During Inspiratory Loading. IEEE Access 2019, 7, 89487–89496. [Google Scholar] [CrossRef]
  161. Elfaramawy, T.; Fall, C.L.; Arab, S.; Morissette, M.; Lellouche, F.; Gosselin, B. A Wireless Respiratory Monitoring System Using a Wearable Patch Sensor Network. IEEE Sens. J. 2019, 19, 650–657. [Google Scholar] [CrossRef]
  162. Sharma, P.; Imtiaz, A.; Rodriguez-Villegas, E. Acoustic Sensing as a Novel Wearable Approach for Cardiac Monitoring at the Wrist. Sci. Rep. 2019, 9, 20079. [Google Scholar] [CrossRef]
  163. Lo Presti, D.; Massaroni, C.; D’Abbraccio, J.; Massari, L.; Caponero, M.; Longo, U.G.; Formica, D.; Oddo, C.M.; Schena, E. Wearable System Based on Flexible FBG for Respiratory and Cardiac Monitoring. IEEE Sens. J. 2019, 19, 7391–7398. [Google Scholar] [CrossRef]
  164. van Gilst, M.M.; Wulterkens, B.M.; Fonseca, P.; Radha, M.; Ross, M.; Moreau, A.; Cerny, A.; Anderer, P.; Long, X.; van Dijk, J.P.; et al. Direct application of an ECG-based sleep staging algorithm on reflective photoplethysmography data decreases performance. BMC Res. Notes 2020, 13. [Google Scholar] [CrossRef]
  165. Chriskos, P.; Frantzidis, C.A.; Nday, C.M.; Gkivogkli, P.T.; Bamidis, P.D.; Kourtidou-Papadeli, C. A review on current trends in automatic sleep staging through bio-signal recordings and future challenges. Sleep Med. Rev. 2021, 55, 101377. [Google Scholar] [CrossRef]
  166. de Zambotti, M.; Cellini, N.; Goldstone, A.; Colrain, I.M.; Baker, F.C. Wearable Sleep Technology in Clinical and Research Settings. Med. Sci. Sport. Exerc. 2019, 51, 1538–1557. [Google Scholar] [CrossRef]
  167. Casson, A. Wearable EEG and beyond. Biomed. Eng. Lett. 2019, 9, 53–71. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Study selection flow diagram.
Figure 2. Number of papers published each year since 2010 related to the topic of wearable sleep staging.
Table 1. Summary of articles included in the review.
Ref | Sensing Modality | Signals/Features | Sleep Stages Classified | Accuracy | Kappa | Validation | Subjects | Age | Location | Data Source
[13] | Pressure sensors | HBI + Movement | Wake-NREM-REM | 79% | 0.44 | PSG | 9 | 20–54 | Hospital | Emfit [14]
[15] | ACC + PPG | HR + Movement | Wake-REM-Light-Deep | 65.3% | 0.48 | PSG | 227 | 36–63 | Sleep Lab | WatchPAT 100 [16]
[17] | RIP + Nasal airflow | Respiratory effort | Wake-NREM-REM | 70% | - | PSG | 16 | 32–56 | Hospital | MIT-BIH [18]
[19] | ACC | Movement | Wake-Sleep | 86.2% | 0.65 | PSG | 30 | 17–24 | Research Lab | Actiwatch-64 [20]
[21] | EEG | 1 channel | Wake-REM-Light-Deep | 81.1% | 0.7 | PSG | 26 | 19–60 | Sleep Lab | Zeo
[22] | PPG + ACC | HRV + Movement | Wake-NREM-REM | 75% | - | PSG | 48 | 22–71 | Hospital | Research Device
[23] | ECG | HRV | Wake-NREM-REM | 73.50% | - | PSG | 7 (OSA) | 42–68 | Hospital | Unknown
[24] | ACC | Hip Movement | Wake-N1-N2-N3-REM | 88.4% | - | PSG | 34 | 20–24 | Sleep Lab | FS-750
[25] | ACC | Movement | Wake-Sleep | 88.0% | 0.3 | PSG | 22 | 20–28 | Sleep Lab | Actiwatch-64 [20]
[26] | RIP + ACC | Effort + Movement | Wake-Sleep | 95.7% | 0.66 | PSG | 15 | 23–58 | Sleep Lab | Actiwatch [20]
[27] | RIP + ECG + ACC | RRI + Movement | Wake-NREM-REM | 78.3% | 0.58 | PSG | 20 | 32–52 | Sleep Lab | ACT/Somnowatch [28]
[29] | EEG | 1 channel | NREM-REM | 83% | 0.61 | PSG | 20 | 20–65 | Hospital | DREAMS [30]
[31] | EEG | 1 channel | Wake-N1 | 92.50% | 0.63 | PSG | 13 | - | Hospital | Sleep EDF Expanded [32]
[33] | PPG | PRV | Wake-NREM-REM | 77–80% | - | PSG | 146 (SDB) | 9–14 | Hospital | Unknown
[34] | Pressure sensors | RR + Movement | Wake-NREM-REM | 70.30% | 0.45 | PSG | 7 | 21–60 | - | Research Device
[35] | ACC | Hip Movement | Wake-Sleep | 86.2% | - | PSG | 108 | 21–24 | Research Lab | GT3X+ [36]
[37] | ACC + EEG | Movement + 1 channel | Wake-REM-Light-Deep | 74.2% | - | PSG | 30 | 18–80 | Sleep Lab | SomnoScreen [38]
[39] | EEG + RIP | Respiratory effort | Wake-N1-N2-N3-REM | 81.70% | - | PSG | - | - | Hospital | Unknown
[40] | ECG | HR | Wake-REM-Light-Deep | 76–85% | - | PSG | 15 | 28–39 | Hospital | CAP Sleep Database [41]
[42] | EEG | 1 channel | Wake-N1-N2-N3-REM | 79% | 0.69 | PSG | 39 | 25–101 | Hospital | Sleep EDF Expanded [32]
[43] | EEG | 1 channel | Wake-REM-Light-Deep | - | 0.72 | PSG | 99 | 18–60 | Sleep Lab | Zmachine [44]
[45] | EEG | 1 channel | Wake-N1-N2-N3-REM | 86.20% | 0.78 | PSG | 8 | 21–35 | Hospital | Sleep EDF Database [32]
[46] | Radar sensors | Respiratory movement | Wake-NREM-REM | 75% | 0.56 | PSG | 29 (SDB) | 22–67 | Hospital | Research Device
[47] | RIP | Respiratory effort | Wake-NREM-REM | 80% | 0.65 | PSG | 29 (SDB) | 22–67 | Hospital | Embla N7000 [48]
[49] | Radar sensors | RR + Movement | REM-Light-Deep | 79.30% | - | PSG | 11 | 21–23 | Research Lab | Research Device
[50] | EEG + EOG + EMG | Ear-EEG | Wake-N1-N2-N3-REM | 95% | - | PSG | 8 | avg. 25 | Sleep Lab | Research Device
[51] | Radar sensors | HRV + Movement | Wake-Sleep | 66.4% | - | PSG | 10 | 21–24 | Research Lab | Research Device
[52] | ECG | RRI | NREM-REM | 76.20% | 0.52 | PSG | 20 | 22–33 | Hospital | MASS PSG Database [53]
[54] | ECG + PPG | HR + Oximetry | Wake-Sleep | 83.80% | - | PSG | 100 | - | Home | SHHS Database [55]
[56] | Audio | Respiratory features | Wake-NREM-REM | 75% | 0.42 | PSG | 20 | 23–70 | Hospital | EDIROL R-4 [57]
[58] | ECG + PPG | HRV + PTT | Wake-N1-N2-N3-REM | 73.40% | - | PSG | 20 (Insomnia) | - | Hospital | SOMNOScreen [38]
[59] | Nasal airflow | RR | Wake-NREM-REM | 74% | 0.49 | PSG | 20 | 25–34 | Hospital | Sleep EDF Expanded [32]
[60] | EEG | 1 channel | Wake-N1-(N2+N3)-REM | 90% | 0.67 | PSG | 29 | - | Research Lab | Sleep Profiler [61]
[62] | ACC | Movement | Wake-Sleep | - | - | PSG | 14 | 3–11 | Hospital | Fitbit Flex [63]
[64] | ACC + PPG | HR + Movement | Wake-Sleep | 90.90% | - | PSG | 32 | 14–20 | Sleep Lab | Fitbit ChargeHR [63]
[65] | EEG | Ear-EEG | Wake-N1-N2-N3 | 76.80% | 0.64 | EEG | 4 | 25–36 | Research Lab | Research Device
[66] | ECG + RIP | RRI + Respiratory effort | Wake-REM-Light-Deep | 87.40% | 0.41 | PSG | 180 | 20–95 | Hospital | Multiple Databases
[67] | EEG | 1 channel | Wake-S1-S2-S3-S4-REM | 90.5% | 0.8 | PSG | 20 | 25–34 | Hospital | Sleep EDF Expanded
[68] | PPG + ACC | HRV + Movement | Wake-REM-Light-Deep | 69% | 0.52 | PSG | 60 | 24–44 | Home | Fitbit Surge [63]
[69] | ECG + ACC | HR + Movement | Wake-NREM-REM | 75% | 0.49 | PSG | 289 (Various) | 37–65 | Hospital | Unknown
[70] | PPG | HRV | Wake-Sleep | 80.10% | - | PSG | - | - | Hospital | Multiple Databases
[71] | ACC + PPG | HRV + Movement | Wake-N1-(N2+N3)-REM | 59.3% | 0.42 | PSG | 51 | 41–60 | Home | Alice PDx
[72] | EEG | 1 channel | Wake-N1-N2-N3-REM | 79% | 0.59 | PSG | 20 | 20–65 | Hospital | DREAMS Subjects [30]
[73] | EEG | 1 channel | Wake-NREM-REM | 77% | 0.56 | PSG | 16 | 32–56 | Hospital | MIT-BIH [18]
[74] | Pressure sensors | HR + RR + Movement | Wake-REM-Light-Deep | 64% | 0.46 | PSG | 66 | 17–72 | Sleep Lab/Home | EarlySense [75]
[76] | EEG | 2 channels | Wake-N1-N2-N3-REM | 94% | - | PSG | 20 | 25–34 | Hospital | Sleep EDF Expanded [32]
[77] | ECG | HRV + EDR | Wake-REM-Light-Deep | 75.4% | 0.54 | PSG | 16 | 32–56 | Hospital | MIT-BIH [18]
[78] | PPG + ACC | HRV + Movement | Wake-REM-Light-Deep | 49–81% | - | PSG | 44 (PLMS) | 19–61 | Sleep Lab | Fitbit Charge 2 [63]
[79] | ECG | HRV | Wake-REM-Light-Deep | 89.20% | - | PSG | 3295 | - | Home | SHHS Database [55]
[80] | PPG + ACC | HR + Movement | Wake-N1-N2-N3-REM | 66.60% | - | PSG | 39 | 19–64 | Hospital | Microsoft Band I
[81] | ACC | Movement | Wake-Sleep | 85.0% | - | PSG | 22 | 20–45 | Home | GT3X+ [36]
[82] | ECG | HRV | Wake-REM-Light-Deep | 71.50% | - | PSG | 16 | 32–56 | Hospital | MIT-BIH [18]
[83] | ECG | RRI | N3 | 90% | 0.56 | PSG | 45 (OSA) | - | Hospital | NI DAQ 6221
[84] | ACC | Movement | Wake-Sleep | 91% | 0.67 | PSG | 27 | 18–64 | Sleep Lab | myCadian
[85] | ACC | Movement | Wake-Sleep | 83% | - | PSG | 1817 | - | Sleep Lab | MESA Sleep Dataset [86]
[87] | EDA | - | Wake-Sleep | 86.0% | - | PSG | 91 | - | Hospital | Research Device
[88] | ECG | HRV + RRV | Wake-N1-N2-N3-REM | 71.16% | 0.52 | PSG | 373 (Various) | 22–56 | Hospital | SOLAR 3000B
[89] | EEG | 1 channel | Wake-S1-S2-S3-S4-REM | 93.60% | 0.87 | PSG | 4 | - | Hospital | Sleep EDF Expanded [32]
[90] | EEG | 1 channel | Wake-N1-N2-N3-REM | 81–92% | 0.89 | PSG | 48 | 20–65 | Hospital | Sleep EDF + DREAMS [30,32]
[91] | ACC | Movement | Wake-Sleep | 89.65% | - | Sleep Diary | 10 | - | Home | GT3X [36]
[92] | Pressure sensors | HRV + RRV | Wake-NREM-REM | 85% | 0.74 | PSG | 5 | 63–69 | Sleep Lab | Research Device
[93] | EEG | 1 channel | Wake-N1-N2-N3-REM | 83.50% | - | PSG | 20 | - | Hospital | Sleep EDF Database [32]
[94] | ECG + ACC + PPG | RRI + HRV + Movement | Wake-REM-Light-Deep | 81% | 0.69 | PSG | 32 | 22–45 | Hospital | SensEcho
[95] | PPG + ACC | HR + Movement | Wake-NREM-REM | 72% | 0.27 | PSG | 31 | 19–55 | Hospital | Apple Watch (2,3) [96]
[97] | ACC | Movement | Wake-Sleep | 85.4% | 0.54 | EEG | 40 | 18–40 | Home | Motion Logger [98]
[99] | PPG + ACC | HRV + RR + Movement | Wake-REM-Light-Deep | 77% | 0.67 | PSG | 50 | - | - | Samsung Smartwatch [100]
[101] | EEG + EMG | 1 channel + chin movement | Wake-N1-N2-N3-REM | 81% | - | PSG | 49 | 36–52 | Sleep Lab | NTHU Database
[102] | EEG | Ear-EEG | Wake-N1-N2-N3-REM | 72.0% | 0.59 | PSG | 15 | 18–63 | Research Lab | Research Device
[103] | PPG + ACC | HR + Movement | Wake-REM-Light-Deep | 64% | 0.47 | PSG | 12 | 19–27 | Research Lab | WHOOP strap [104]
[105] | EEG | Ear-EEG | Wake-N1-N2-N3-REM | 81% | 0.74 | PSG | 13 | 18–60 | Hospital | Research Device
[106] | Sonar | RR + Movement | Wake-REM-Light-Deep | 60% | 0.39 | PSG | 62 | 31–63 | Sleep Lab | Smartphone
[107] | Audio + ACC | RR + HRV + Movement | Wake-REM-Light-Deep | 57% | 0.36 | PSG | 53 (OSA) | 43–72 | Hospital | Research Device
[108] | EEG | Ear-EEG | Wake-N1-N2-N3-REM | 74.10% | 0.61 | PSG | 22 | 19–29 | Home | Research Device
[109] | EEG | 4 channels | Wake-N1-N2-N3-REM | 77.3% | 0.69 | PSG | 243 | >18 | Hospital | HomePAP Database [110]
[111] | Audio | Breathing/snoring sounds | Wake-NREM-REM | 75% | 0.42 | PSG | 13 | 21–26 | Sleep Lab | Research Device
[112] | PPG + ACC | HRV + Movement | Wake-REM-Light-Deep | 65–89% | 0.54 | EEG | 35 | 14–40 | Home | Fitbit Charge 2 [63]
[113] | Radar sensors | Respiratory movements | Wake-REM-Light-Deep | 62.70% | 0.46 | PSG | 9 | 23–27 | Home | Circadia C100 [114]
[115] | ACC | Movement | Wake-Sleep | 90.30% | 0.54 | PSG | 41 | >13 | Sleep Lab | Arc
[116] | ECG | RRI + RR | Wake-NREM-REM | 76.50% | 0.49 | PSG | 25 (SDB) | >18 | Hospital | Unknown
[117] | PPG + ACC | HR + Movement | Wake-N1-N2-N3-REM | 54.20% | - | PSG | 18 | - | Sleep Lab | Basis B1
[118] | ACC | Movement | Wake-Sleep | 87.70% | 0.6 | EEG | 40 | avg. 26.7 | Home | Motion Logger [98]
[119] | ACC + PPG | HR + Movement | Wake-Sleep | 89.90% | 0.42 | PSG | 8 | 35–50 | Sleep Lab | Apple Watch + Oura [96,120]
[121] | PPG | Oximetry | Wake-N1-N2-N3-REM | 64.1% | 0.51 | PSG | 894 (OSA) | 44–66 | Hospital | Xpod 3011 [122]
[123] | PPG | PPI | Wake-Sleep | 81.10% | 0.52 | PSG | 10 (SDB) | 43–75 | Hospital | Unknown
[124] | ECG + ACC | HRV + Movement | Wake-N1-(N2+N3)-REM | 75.9% | 0.6 | PSG | 389 (Multiple) | - | Hospital | SOMNIA Dataset [125]
[126] | ACC | Movement | Wake-Sleep | 84.7% | 0.45 | PSG | 43 | 45–84 | Sleep Lab | MESA Sleep Dataset [86]
[127] | ECG | HR | Wake-REM-Light-Deep | 77% | 0.66 | PSG | - | - | Sleep Lab | MESA Sleep + SHHS [55,86]
[128] | ECG + ACC | HRV + Movement | Wake-REM-Light-Deep | 66.9% | 0.51 | PSG | 20 | 20–37 | Home | Firstbeat [129]
[130] | ACC | Movement | Wake-Sleep | 90.3% | - | PSG | 8 | 18–35 | Research Lab | Zulu Watch
[131] | ACC | Finger Movement | Wake-Sleep | 85.0% | - | PSG | 25 | - | Sleep Lab | THIM [132]
PSG—Polysomnography, EEG—Electroencephalography, EOG—Electrooculography, EMG—Electromyography, ECG—Electrocardiography, ACC—Accelerometer, RIP—Respiratory Inductive Plethysmography, PPG—Photoplethysmography, EDA—Electrodermal Activity, REM—Rapid Eye Movement, NREM—Non-Rapid Eye Movement, HBI—Heart Beat Interval, RRI—R-R Interval, PPI—Pulse Peak Interval, HRV—Heart Rate Variability, PRV—Pulse Rate Variability, RRV—Respiratory Rate Variability, HR—Heart Rate, RR—Respiratory Rate, PTT—Pulse Transit Time, EDR—ECG-Derived Respiration, OSA—Obstructive Sleep Apnoea, SDB—Sleep Disordered Breathing, PLMS—Periodic Limb Movements of Sleep.
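
Many of the studies in Table 1 report epoch-by-epoch agreement with PSG both as raw accuracy and as Cohen's kappa, which corrects for the agreement expected by chance. Purely as an illustration (the function name and the five-epoch toy hypnograms below are not taken from any of the reviewed studies), a minimal Python sketch of how such a kappa value is obtained from two aligned hypnograms is:

```python
import numpy as np

def cohen_kappa(reference, predicted, stages=("Wake", "N1", "N2", "N3", "REM")):
    """Epoch-by-epoch Cohen's kappa between a reference (PSG) hypnogram
    and an automatically scored hypnogram."""
    reference = np.asarray(reference)
    predicted = np.asarray(predicted)
    # Observed agreement: fraction of epochs scored identically.
    p_o = np.mean(reference == predicted)
    # Agreement expected by chance, from the marginal stage frequencies.
    p_e = sum(np.mean(reference == s) * np.mean(predicted == s) for s in stages)
    return (p_o - p_e) / (1 - p_e)

# Toy example: five 30-second epochs scored by PSG and by a wearable algorithm.
psg      = ["Wake", "N1", "N2", "N2", "REM"]
wearable = ["Wake", "N2", "N2", "N2", "REM"]
print(cohen_kappa(psg, wearable))  # accuracy = 0.80, kappa ≈ 0.71
```

For the same toy data, accuracy alone would be 0.80 while kappa is closer to 0.71, which is why the Accuracy and Kappa columns in Table 1 can differ substantially for the same study.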
Table 2. Number of articles using each sensing modality as a single data source and in combination with other sensors.
Sensing Modality | As Single Source | In Combination
EEG | 19 | 4
ECG | 10 | 8
EMG | 0 | 2
EOG | 0 | 1
Accelerometer | 15 | 21
RIP | 1 | 3
Radar sensors | 4 | 0
Pressure sensors | 4 | 0
Audio | 2 | 1
PPG | 4 | 16
Nasal airflow | 1 | 1
Sonar sensors | 1 | 0
EDA | 1 | 0
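
Several of the cardiac entries in Tables 1 and 2 rely on beat-interval features (HBI, RRI, PPI, HRV, and PRV in the legend of Table 1) rather than on the raw ECG or PPG waveform. As a hedged illustration only, and not the method of any particular reviewed study, the sketch below derives inter-beat intervals from a hypothetical list of detected R-peak times and computes two standard time-domain HRV features (SDNN and RMSSD); the function names and peak times are illustrative:

```python
import numpy as np

def rr_intervals_ms(r_peak_times_s):
    """R-R (or pulse-peak) intervals in milliseconds from detected peak times."""
    return np.diff(np.asarray(r_peak_times_s)) * 1000.0

def sdnn(rr_ms):
    """SDNN: standard deviation of the beat-to-beat intervals,
    a common time-domain HRV feature."""
    return float(np.std(rr_ms, ddof=1))

def rmssd(rr_ms):
    """RMSSD: root mean square of successive interval differences."""
    return float(np.sqrt(np.mean(np.diff(rr_ms) ** 2)))

# Toy example: R-peak times (in seconds) over a few heartbeats.
peaks = [0.00, 0.83, 1.70, 2.52, 3.40, 4.21]
rr = rr_intervals_ms(peaks)   # [830, 870, 820, 880, 810] ms
print(sdnn(rr), rmssd(rr))
```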