Sleep analysis in animal models typically involves recording an electroencephalogram (EEG) and electromyogram (EMG) and scoring vigilance state in brief epochs of data as Wake, REM (rapid eye movement) sleep, or NREM (non-REM) sleep, either manually or with a computer algorithm. Computerized methods usually estimate features from each epoch, such as the spectral power associated with distinctive cortical rhythms, and partition the feature space into state-specific regions by applying thresholds or by using supervised/unsupervised statistical classifiers. Several factors should be considered when using such methods:

  • Most classifiers require scored sample data, elaborate heuristics, or computational steps that are not easily reproduced by the average sleep researcher, who is the intended end user.
  • Even when prediction is reasonably accurate, small errors can lead to large discrepancies in estimates of important sleep metrics such as the number of bouts or their duration.
  • As we show here, modeling transitions between vigilance states, rather than only partitioning the feature space by state, can yield more accurate scores and metrics.
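As a concrete illustration of the epoch-wise spectral features mentioned above, the sketch below computes delta-band and theta-band power per epoch with a plain periodogram. The epoch length, band limits, and function names here are illustrative assumptions, not the implementation used in the paper (which is provided as segway_sleep.m).

```python
import numpy as np

def band_power(epoch, fs, lo, hi):
    """Power of one epoch in the [lo, hi) Hz band via the periodogram."""
    freqs = np.fft.rfftfreq(epoch.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(epoch)) ** 2 / epoch.size
    return psd[(freqs >= lo) & (freqs < hi)].sum()

def epoch_features(eeg, fs, epoch_sec=4.0):
    """Delta (0.5-4 Hz) and theta (5-9 Hz) power for each epoch."""
    n = int(fs * epoch_sec)
    epochs = eeg[: eeg.size // n * n].reshape(-1, n)
    delta = np.array([band_power(e, fs, 0.5, 4.0) for e in epochs])
    theta = np.array([band_power(e, fs, 5.0, 9.0) for e in epochs])
    return delta, theta

# Toy signal: a 2 Hz ("delta-like") epoch followed by a 7 Hz ("theta-like") one
fs = 100
t = np.arange(400) / fs
eeg = np.concatenate([np.sin(2 * np.pi * 2 * t), np.sin(2 * np.pi * 7 * t)])
delta, theta = epoch_features(eeg, fs)
print(delta > theta)  # → [ True False]
```

In practice such per-epoch features (often alongside an EMG power feature) form the space that is then thresholded or clustered by vigilance state.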

An unsupervised sleep segmentation framework, “SegWay”, is demonstrated by applying the algorithm step-by-step to unlabeled EEG recordings in mice. The accuracy of sleep scoring and estimation of sleep metrics is validated against manual scores.
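The benefit of modeling state transitions can be illustrated with a minimal first-order smoother: a Viterbi pass over per-epoch state likelihoods that penalizes improbable transitions, so that a single ambiguously scored epoch does not fragment a bout and inflate bout counts. This is a generic sketch with made-up probabilities, not the SegWay algorithm itself.

```python
import numpy as np

def viterbi_smooth(log_lik, log_trans):
    """Most likely state sequence given per-epoch log-likelihoods
    (n_epochs x n_states) and a log transition matrix (n_states x n_states)."""
    n, k = log_lik.shape
    score = log_lik[0].copy()
    back = np.zeros((n, k), dtype=int)
    for t in range(1, n):
        cand = score[:, None] + log_trans      # cand[i, j]: best arrival at j from i
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0) + log_lik[t]
    path = np.zeros(n, dtype=int)
    path[-1] = score.argmax()
    for t in range(n - 2, -1, -1):             # trace the best path backwards
        path[t] = back[t + 1, path[t + 1]]
    return path

# States 0=Wake, 1=NREM, 2=REM; one weakly "REM" epoch inside a NREM bout
ll = np.log(np.array([[0.1, 0.8, 0.1],
                      [0.1, 0.8, 0.1],
                      [0.2, 0.35, 0.45],   # epoch-wise argmax would say REM
                      [0.1, 0.8, 0.1],
                      [0.1, 0.8, 0.1]]))
trans = np.log(np.array([[0.90, 0.09, 0.01],
                         [0.05, 0.90, 0.05],
                         [0.10, 0.10, 0.80]]))
print(viterbi_smooth(ll, trans))  # → [1 1 1 1 1]
```

Epoch-by-epoch classification would label the middle epoch REM, splitting one NREM bout into two; the transition model absorbs it, which is the kind of effect that changes bout-number and bout-duration estimates.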

Notes/Citation Information

Published in MethodsX, v. 3, p. 144-155.

© 2016 The Authors.

This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).

Funding Information

This research was supported by a grant (NS083218) from the National Institute of Neurological Disorders and Stroke, U.S.A.

Related Content

A MATLAB file, segway_sleep.m, and sample Light and Dark feature data (segway_sample_data.mat) are available as supplementary material for readers who wish to apply the methodology described here to their own data. The authors request that users cite this paper when using this material.

1-s2.0-S2215016116000108-mmc1.m (11 kB)

1-s2.0-S2215016116000108-mmc2.mat (482 kB)