
Engineering: Open Access(EOA)

ISSN: 2993-8643 | DOI: 10.33140/EOA

Impact Factor: 1.4

Transformer CS-EEG: A Transformer-Based Deep Learning Framework for Compressed Sensing of EEG Signals

Zhiying Xu and Huotao Gao

Abstract

Electroencephalogram (EEG) signals present significant challenges for efficient acquisition and processing due to their high dimensionality and complex spatio-temporal patterns. This paper presents Transformer CS-EEG, a novel Transformer-based deep learning framework for compressed sensing of EEG signals. Our approach leverages the self-attention mechanism to effectively capture long-range dependencies and complex spatio-temporal correlations in EEG data. We introduce three key innovations: (1) a specialized EEG sampling encoder that adaptively learns optimal sampling patterns using channel-wise attention mechanisms, (2) frequency-aware multi-head self-attention tailored to EEG characteristics, with heads targeting different neurophysiological frequency bands, and (3) a multiscale reconstruction decoder that progressively recovers signal details through hierarchical upsampling. Extensive experiments on three public EEG datasets (CHB-MIT for seizure detection, BCI Competition IV for motor imagery, and DEAP for emotion recognition) demonstrate that our approach consistently outperforms state-of-the-art methods across various compression ratios, achieving up to 18% lower reconstruction error and 2.4 dB higher signal-to-noise ratio at 10× compression. Furthermore, our method preserves essential neurophysiological information, maintaining over 96% of the original accuracy in downstream EEG analysis tasks. The proposed framework offers a promising solution for efficient EEG data acquisition in resource-constrained environments while ensuring high-quality signal reconstruction for accurate clinical interpretation and brain-computer interface applications.
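To make the compressed-sensing setting concrete, the sketch below illustrates the measurement model the abstract refers to: an EEG segment of n samples is acquired as m ≪ n linear measurements (here 10× compression, matching the reported operating point), and a reconstruction is computed from the measurements alone. The synthetic alpha/beta test signal, the random Gaussian sampling matrix, and the pseudoinverse baseline are all illustrative assumptions, not the paper's method; the proposed framework replaces the fixed random matrix with a learned sampling encoder and the linear baseline with a Transformer decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-channel EEG segment: band-limited oscillations
# (alpha ~10 Hz, beta ~20 Hz) plus noise; 256 samples at 256 Hz.
fs, n = 256, 256
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
x += 0.05 * rng.standard_normal(n)

# Compressed-sensing acquisition: a random Gaussian sampling matrix Phi
# maps the n-sample signal to m measurements (10x compression: m = n/10).
m = n // 10
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
y = Phi @ x  # only these m values are acquired/transmitted

# Baseline linear reconstruction via the pseudoinverse (minimum-norm
# solution of the underdetermined system); a learned decoder supplies
# much stronger signal priors and hence far better recovery.
x_hat = np.linalg.pinv(Phi) @ y

# Reconstruction quality as signal-to-noise ratio in dB.
snr_db = 10 * np.log10(np.sum(x**2) / np.sum((x - x_hat) ** 2))
print(f"measurements: {m}/{n} (10x compression), baseline SNR: {snr_db:.1f} dB")
```

The gap between this linear baseline's SNR and what data-driven decoders achieve is precisely the headroom that learned reconstruction methods, including the Transformer decoder described above, aim to close.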
