Distribution shifts across subjects, datasets, and recording devices pose a major challenge for deep learning models applied to physiological signals such as EEG. While standard normalization layers (BatchNorm, LayerNorm, InstanceNorm) improve training stability, they ignore temporal correlations and spectral structure.
We introduce PSDNorm, a normalization layer that aligns the power spectral density (PSD) of feature maps to a running Riemannian barycenter. By leveraging temporal context and optimal transport, PSDNorm improves robustness, generalization, and data efficiency.
Experiments on large-scale sleep staging benchmarks (10 datasets, 10K subjects) show that PSDNorm consistently outperforms existing normalization methods, especially under domain shift and limited-data regimes.
PSDNorm replaces classical normalization layers by explicitly modeling temporal autocorrelations. For each feature map, PSDNorm estimates the power spectral density, updates a running Riemannian barycenter across training batches, and maps the feature map toward that barycenter via optimal transport.
The filter size f controls the extent of temporal context, allowing PSDNorm to interpolate between InstanceNorm (f=1) and stronger spectral alignment.
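A minimal NumPy sketch of the idea, under stated assumptions: PSDs are estimated Welch-style from length-f segments, the barycenter is approximated here by a geometric mean across feature maps (a stand-in for the running Riemannian barycenter), and the Monge-map-style alignment is a per-frequency filter sqrt(barycenter / PSD). The function names `psd_estimate` and `psdnorm` are illustrative, not the library's API.

```python
import numpy as np

def psd_estimate(x, f):
    """Welch-style PSD of shape (..., f//2 + 1): average periodograms
    of non-overlapping length-f segments (rectangular window)."""
    n = (x.shape[-1] // f) * f
    segs = x[..., :n].reshape(*x.shape[:-1], -1, f)
    return (np.abs(np.fft.rfft(segs, axis=-1)) ** 2).mean(axis=-2) / f

def psdnorm(x, f=16, barycenter=None, eps=1e-8):
    """Align each row's PSD to a shared barycenter.

    x: array (channels, time); f: filter size controlling temporal context.
    Returns the aligned signals and the barycenter PSD used.
    """
    psd = psd_estimate(x, f)                        # (channels, f//2 + 1)
    if barycenter is None:
        # Geometric mean across channels: an illustrative proxy for the
        # running Riemannian barycenter used in the paper.
        barycenter = np.exp(np.log(psd + eps).mean(axis=0))
    # Monge-map-style spectral filter: scale each band by sqrt(bary / psd).
    h = np.sqrt((barycenter + eps) / (psd + eps))   # (channels, f//2 + 1)
    # Length-f zero-phase-ish time-domain filter, applied in the
    # frequency domain over the full signal length.
    H = np.fft.irfft(h, n=f, axis=-1)
    Hf = np.fft.rfft(H, n=x.shape[-1], axis=-1)
    Xf = np.fft.rfft(x, axis=-1)
    y = np.fft.irfft(Xf * Hf, n=x.shape[-1], axis=-1)
    return y, barycenter
```

With f=1 the PSD estimate collapses to the per-channel power, so the mapping reduces to a simple rescaling, consistent with the InstanceNorm limit described above; larger f aligns progressively finer spectral structure.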
PSDNorm is evaluated on 10 large-scale sleep staging datasets, covering 10K subjects and multiple recording devices. We evaluate robustness under domain shift, generalization to unseen datasets, and data efficiency in low-data regimes.
PSDNorm consistently improves performance across architectures (USleep and CNN-Transformers) compared to existing normalization layers (BatchNorm, LayerNorm, InstanceNorm), underscoring the importance of aligning temporal correlations and spectral structure in physiological signals.
An important requirement for medical applications is reliable performance for every subject. The scatter plot below shows that PSDNorm improves performance for most subjects, especially those with low performance under BatchNorm, which are clinically the most important cases to improve.
@inproceedings{gnassounou2026psdnorm,
  title     = {PSDNorm: Temporal Normalization for Deep Learning in Sleep Staging},
  author    = {Gnassounou, Théo and Collas, Antoine and Flamary, Rémi and Gramfort, Alexandre},
  booktitle = {International Conference on Learning Representations},
  year      = {2026}
}