ZHANG Xiaoli, ZHANG Xizhen, LIN Dongmei, et al. Automatic sleep staging method based on CNN-BiGRU and multi-head self-attention mechanism [J]. Chinese Journal of Medical Physics, 2025, 42(4): 496-504. [doi:10.3969/j.issn.1005-202X.2025.04.011]

Automatic sleep staging method based on CNN-BiGRU and multi-head self-attention mechanism

Chinese Journal of Medical Physics [ISSN: 1005-202X / CN: 44-1351/R]

Volume:
42
Issue:
2025, No. 4
Pages:
496-504
Column:
Medical Signal Processing and Medical Instruments
Publication date:
2025-04-20

Article Info

Title:
Automatic sleep staging method based on CNN-BiGRU and multi-head self-attention mechanism
Article number:
1005-202X(2025)04-0496-09
Author(s):
ZHANG Xiaoli 1,2, ZHANG Xizhen 1,2, LIN Dongmei 3, CHEN Fuming 2
1. School of Medical Information Engineering, Gansu University of Chinese Medicine, Lanzhou 730000, China; 2. Medical Security Center, the 940th Hospital of Joint Logistics Support Force of Chinese People's Liberation Army, Lanzhou 730050, China; 3. School of Electrical and Information Engineering, Lanzhou University of Technology, Lanzhou 730050, China
Keywords:
sleep staging; class balance; residual network; bidirectional gated recurrent network
CLC number:
R318
DOI:
10.3969/j.issn.1005-202X.2025.04.011
Document code:
A
Abstract:
The study addresses class imbalance in sleep EEG data and the gradient vanishing or explosion that may occur when deep networks extract more features. An improved adaptive synthetic sampling technique is first employed to augment the minority classes of sleep EEG data. Convolutional neural networks and residual networks are then used to learn data features, while a 3-layer bidirectional gated recurrent network explores deep temporal information and establishes correlations between sleep stages, enabling automatic feature learning and sleep cycle extraction. Finally, a multi-head self-attention mechanism enhances the model's focus on the critical parts of the sequence and completes the classification of the sleep stages. Experimental results show that, under the AASM sleep staging criteria and after data class balancing, the automatic sleep staging model integrating CNN-BiGRU and multi-head self-attention achieves an overall accuracy of 90.77% and a Kappa coefficient of 0.88 on the Sleep-EDF-20 dataset, with the precision of the N1 stage reaching 87.1%. On the Sleep-EDFx dataset, the model attains an MF1 score of 0.84 and an N1-stage precision of 77.2%. These metrics improve on the CNN-BiGRU model tested on the original dataset, and the proposed architecture exhibits superior sleep stage classification accuracy compared with other related studies, validating the effectiveness and generalization capability of the proposed method.
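The abstract describes a staged pipeline: class-balanced resampling of the EEG epochs, a CNN/residual feature extractor, a 3-layer bidirectional GRU over the feature sequence, multi-head self-attention, and a final classifier. As a rough illustration only, the PyTorch sketch below wires such blocks together for single-channel 30 s EEG epochs; the SleepStager name, layer sizes, and kernel widths are assumptions for demonstration, not the authors' published configuration.

import torch
import torch.nn as nn

class SleepStager(nn.Module):
    """Illustrative CNN + residual + BiGRU + multi-head self-attention stager.

    Expects single-channel EEG epochs of shape (batch, 1, 3000),
    i.e. 30 s at 100 Hz. All hyperparameters are assumptions.
    """

    def __init__(self, n_classes: int = 5):
        super().__init__()
        # CNN front end: wide strided convolution followed by pooling
        self.conv = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=50, stride=6, padding=24),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.MaxPool1d(8),
        )
        # Simple residual block to deepen the extractor while easing gradient flow
        self.res = nn.Sequential(
            nn.Conv1d(64, 64, kernel_size=7, padding=3),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=7, padding=3),
            nn.BatchNorm1d(64),
        )
        # 3-layer bidirectional GRU over the CNN feature sequence
        self.bigru = nn.GRU(input_size=64, hidden_size=128, num_layers=3,
                            batch_first=True, bidirectional=True)
        # Multi-head self-attention over the BiGRU outputs
        self.attn = nn.MultiheadAttention(embed_dim=256, num_heads=8,
                                          batch_first=True)
        self.classifier = nn.Linear(256, n_classes)

    def forward(self, x):
        h = self.conv(x)                        # (B, 64, T)
        h = torch.relu(h + self.res(h))         # residual connection
        h = h.transpose(1, 2)                   # (B, T, 64) for the GRU
        h, _ = self.bigru(h)                    # (B, T, 256)
        h, _ = self.attn(h, h, h)               # self-attention: Q = K = V
        return self.classifier(h.mean(dim=1))   # pool over time, then classify

if __name__ == "__main__":
    model = SleepStager()
    logits = model(torch.randn(4, 1, 3000))     # four dummy 30 s epochs
    print(logits.shape)                         # torch.Size([4, 5])

For prototyping the class-balancing step, the ADASYN implementation in the imbalanced-learn package (imblearn.over_sampling.ADASYN) is a convenient stand-in, although the paper describes an improved variant of adaptive synthetic sampling rather than the stock algorithm.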


Memo:
[Received] 2024-09-12. [Funding] National Natural Science Foundation of China (61901515, 62361038); Natural Science Foundation of Gansu Province (22JR5RA002); in-hospital project of the 940th Hospital of the Joint Logistics Support Force (2023YXKY018). [First author] ZHANG Xiaoli, master's student, research interests: medical signal detection and processing, E-mail: zxlxr86@163.com. [Corresponding author] CHEN Fuming, PhD, senior engineer, master's supervisor, research interests: medical signal detection and processing, E-mail: cfm5762@126.com.
Last update: 2025-04-30