

Poster

SPDIM: Source-Free Unsupervised Conditional and Label Shift Adaptation in EEG

Shanglin Li · Motoaki Kawanabe · Reinmar Kobler

Hall 3 + Hall 2B #59
[ Project Page ]
Sat 26 Apr midnight PDT — 2:30 a.m. PDT

Abstract:

The non-stationary nature of electroencephalography (EEG) introduces distribution shifts across domains (e.g., days and subjects), posing a significant challenge to the generalization of EEG-based neurotechnology. Without labeled calibration data for the target domains, this constitutes a source-free unsupervised domain adaptation (SFUDA) problem. For scenarios with a constant label distribution, Riemannian geometry-aware statistical alignment frameworks on the symmetric positive definite (SPD) manifold are considered state-of-the-art. However, many practical scenarios, including EEG-based sleep staging, exhibit label shifts. Here, we propose a geometric deep learning framework for SFUDA problems under specific distribution shifts, including label shifts. We introduce a novel, realistic generative model and show that prior Riemannian statistical alignment methods on the SPD manifold can compensate for specific marginal and conditional distribution shifts but hurt generalization under label shifts. As a remedy, we propose a parameter-efficient manifold optimization strategy termed SPDIM. SPDIM uses the information maximization principle to learn a single SPD-manifold-constrained parameter per target domain. In simulations, we demonstrate that SPDIM can compensate for the shifts under our generative model. Moreover, using public EEG-based brain-computer interface and sleep staging datasets, we show that SPDIM outperforms prior approaches.
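For intuition only, below is a minimal, hypothetical sketch (not the authors' implementation) of the two ingredients the abstract names: an information maximization objective over unlabeled target-domain predictions, and a single SPD-constrained parameter learned per target domain. The frozen linear classifier, the feature shapes, and the parametrization of the SPD parameter as the matrix exponential of a symmetric matrix are all illustrative assumptions.

```python
# Minimal sketch under stated assumptions; not the authors' SPDIM code.
import torch


def spd_from_symmetric(a: torch.Tensor) -> torch.Tensor:
    """Map an unconstrained square matrix to an SPD matrix via the matrix
    exponential of its symmetric part (assumed parametrization)."""
    sym = 0.5 * (a + a.transpose(-1, -2))
    return torch.matrix_exp(sym)  # expm of a symmetric matrix is SPD


def information_maximization_loss(logits: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Information maximization: encourage confident (low-entropy) per-sample
    predictions while keeping the predicted label marginal diverse (high entropy)."""
    probs = torch.softmax(logits, dim=1)
    # mean per-sample prediction entropy (minimized -> confident predictions)
    cond_entropy = -(probs * torch.log(probs + eps)).sum(dim=1).mean()
    # entropy of the average prediction (maximized -> balanced label usage)
    marginal = probs.mean(dim=0)
    marg_entropy = -(marginal * torch.log(marginal + eps)).sum()
    return cond_entropy - marg_entropy


# Toy usage: adapt one unconstrained matrix per target domain whose SPD image
# re-weights fixed target-domain features before a frozen source classifier.
n_features, n_classes, n_trials = 8, 4, 64
features = torch.randn(n_trials, n_features)           # stand-in for tangent-space EEG features
classifier = torch.nn.Linear(n_features, n_classes)    # frozen source-domain classifier
for p in classifier.parameters():
    p.requires_grad_(False)

domain_param = torch.zeros(n_features, n_features, requires_grad=True)
optimizer = torch.optim.Adam([domain_param], lr=1e-2)

for _ in range(100):
    spd_bias = spd_from_symmetric(domain_param)         # SPD-constrained per-domain parameter
    logits = classifier(features @ spd_bias)            # re-biased target-domain features
    loss = information_maximization_loss(logits)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```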
