LIU Hongbin, GU De. Colorectal polyp segmentation algorithm integrating Transformer and convolution[J]. Chinese Journal of Medical Physics, 2024, 41(3): 316-322. [doi:10.3969/j.issn.1005-202X.2024.03.008]

Colorectal polyp segmentation algorithm integrating Transformer and convolution

Chinese Journal of Medical Physics [ISSN: 1005-202X / CN: 44-1351/R]

Volume:
41
Issue:
2024, No. 3
Pages:
316-322
Column:
Medical Imaging Physics
Publication date:
2024-03-27

Article Info

Title:
Colorectal polyp segmentation algorithm integrating Transformer and convolution
Article number:
1005-202X(2024)03-0316-07
Author(s):
LIU Hongbin, GU De
School of Internet of Things Engineering, Jiangnan University, Wuxi 214122, Jiangsu, China
Keywords:
polyp segmentation; feature fusion; Transformer; convolution
CLC number:
R318; TP183
DOI:
10.3969/j.issn.1005-202X.2024.03.008
Document code:
A
Abstract:
In response to the challenges posed by the varied sizes and diverse shapes of colorectal polyps, especially blurred boundaries that complicate localization and small polyps that are prone to being missed, a colorectal polyp segmentation algorithm integrating Transformer and convolution is proposed. A Transformer is employed to extract global features from images, ensuring the network's global modeling capability and improving the localization of both the main polyp regions and vague boundaries. Convolution is then introduced to augment the network's ability to process polyp details, refining boundary segmentation and enhancing the capture of small polyps. Finally, the features extracted by the Transformer and convolution branches are deeply fused to achieve feature complementarity. Experimental evaluation on the CVC-ClinicDB and Kvasir-SEG datasets shows that the algorithm achieves similarity coefficients of 95.4% and 93.2%, and mean intersection over union of 91.3% and 88.6%, respectively. Further generalization tests on the CVC-ColonDB, CVC-T, and ETIS datasets yield similarity coefficients of 81.3%, 90.9%, and 80.1%, respectively. The results indicate that the proposed algorithm notably improves the accuracy of polyp segmentation.
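The fusion described in the abstract can be pictured as a two-branch block: a convolutional branch for local detail and a self-attention branch for global context, whose outputs are merged before decoding. The PyTorch sketch below only illustrates that idea under assumed settings; it is not the paper's actual network, and the module name DualBranchFusionBlock, the channel count, and the concatenation-plus-1x1-convolution fusion are all assumptions made for illustration.

```python
# Minimal sketch of the dual-branch idea: a Transformer branch for global
# context, a convolutional branch for local detail, and a fusion step that
# concatenates both feature maps. NOT the authors' implementation; layer
# sizes and the fusion operator are assumptions.
import torch
import torch.nn as nn


class DualBranchFusionBlock(nn.Module):
    def __init__(self, channels: int = 64, num_heads: int = 4):
        super().__init__()
        # Local branch: plain convolutions to sharpen boundaries / small polyps.
        self.conv_branch = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        # Global branch: self-attention over the flattened spatial positions.
        self.attn_branch = nn.TransformerEncoderLayer(
            d_model=channels, nhead=num_heads, batch_first=True
        )
        # Fusion: concatenate the two branches and project back with a 1x1 conv.
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        local_feat = self.conv_branch(x)                      # (B, C, H, W)
        tokens = x.flatten(2).transpose(1, 2)                 # (B, H*W, C)
        global_feat = self.attn_branch(tokens)                # (B, H*W, C)
        global_feat = global_feat.transpose(1, 2).reshape(b, c, h, w)
        return self.fuse(torch.cat([local_feat, global_feat], dim=1))


if __name__ == "__main__":
    block = DualBranchFusionBlock(channels=64)
    feat = torch.randn(1, 64, 44, 44)   # a hypothetical downsampled feature map
    print(block(feat).shape)            # torch.Size([1, 64, 44, 44])
```

Concatenation followed by a 1x1 convolution is only one simple way to realize the "deep fusion" the abstract mentions; attention-weighted or gated fusion would fit the same two-branch structure.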

Memo:
Received: 2023-12-20. Funding: Natural Science Foundation of Jiangsu Province (BK20231036, BK20180594). About the author: LIU Hongbin, master's degree, research interests: deep learning and medical image processing, E-mail: 2469607141@qq.com. Corresponding author: GU De, PhD, associate professor, research interests: wireless sensor network topology identification and image-based biomedical information recognition, E-mail: gude@jiangnan.edu.cn
Last update: 2024-03-27