Self-attention based dual pathway network for spine segmentation in X-ray image
Chinese Journal of Medical Physics (《中国医学物理学杂志》) [ISSN:1005-202X/CN:44-1351/R]
- Issue:
- No. 11, 2022
- Page:
- 1385-1392
- Research Field:
- Medical Imaging Physics
- Publishing date:
- Title:
- Self-attention based dual pathway network for spine segmentation in X-ray image
- Author(s):
- SHI Wenbo1; YANG Huan1; XI Yongming2; DUAN Wenyu1; XU Tongshuai2; DU Yukun2
- 1. College of Computer Science and Technology, Qingdao University, Qingdao 266071, China; 2. Department of Spinal Surgery, Laoshan Branch, the Affiliated Hospital of Qingdao University, Qingdao 266000, China
- Keywords:
- spine image segmentation; U-Net; semantic segmentation; dual pathway network; self-attention mechanism
- PACS:
- R318;R816.8
- DOI:
- 10.3969/j.issn.1005-202X.2022.11.011
- Abstract:
- The segmentation of full-spine X-ray images into the spine, sacrum and iliac bones is an essential step in the intelligent diagnosis of spinal diseases. A semantic segmentation network named dual pathway with self-attention for refined U-Net (DAU-Net) is proposed to address the poor accuracy of the U-Net semantic segmentation algorithm in multi-region segmentation of full-spine X-ray images. DAU-Net adopts a spatial pathway and a semantic pathway to learn spatial information and semantic information separately, and then fuses these two types of features at the decoder, thereby obtaining more accurate segmentation boundaries in full-spine X-ray images. In the spatial pathway, dilated convolutions and residual blocks are used to expand the receptive field and capture long-range dependency features. In the semantic pathway, a self-attention mechanism is applied, and different self-attention encoders and decoders are designed to build global associations for the semantic segmentation of multiple target bone regions. Experimental results show that DAU-Net effectively improves segmentation accuracy in full-spine X-ray images, with a Dice coefficient 4.00%, 1.90%, 4.60% and 1.19% higher than those of U-Net, ResU-Net, Attention U-Net and U-Net++, respectively.
Last Update: 2022-11-25
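
The abstract describes DAU-Net only at a high level. As a rough illustration of the dual-pathway idea it outlines (a spatial pathway built from dilated residual blocks and a semantic pathway built from self-attention blocks, fused before the segmentation head), the following PyTorch sketch may help; the module layout, channel widths, downsampling factor and class count (background, spine, sacrum, iliac bone) are assumptions of this sketch, not the authors' published implementation.

```python
# Minimal dual-pathway sketch inspired by the abstract; all sizes and the
# fusion scheme are assumptions, not the authors' DAU-Net code.
import torch
import torch.nn as nn


class DilatedResidualBlock(nn.Module):
    """Residual block with dilated convolutions to enlarge the receptive field."""
    def __init__(self, channels, dilation=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=dilation, dilation=dilation),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=dilation, dilation=dilation),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(x + self.body(x))  # residual connection


class SelfAttentionBlock(nn.Module):
    """Multi-head self-attention over flattened feature maps (global association)."""
    def __init__(self, channels, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(channels, heads, batch_first=True)
        self.norm = nn.LayerNorm(channels)

    def forward(self, x):
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)        # (B, H*W, C)
        out, _ = self.attn(tokens, tokens, tokens)   # global pairwise attention
        tokens = self.norm(tokens + out)
        return tokens.transpose(1, 2).reshape(b, c, h, w)


class DualPathwaySketch(nn.Module):
    """Spatial pathway (dilated residual blocks) + semantic pathway (self-attention),
    fused before a segmentation head; a simplified stand-in for DAU-Net."""
    def __init__(self, in_ch=1, base_ch=32, num_classes=4):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(in_ch, base_ch, 3, padding=1),
                                  nn.ReLU(inplace=True))
        # Spatial pathway: keep resolution, grow the receptive field with dilation.
        self.spatial = nn.Sequential(
            DilatedResidualBlock(base_ch, dilation=2),
            DilatedResidualBlock(base_ch, dilation=4),
        )
        # Semantic pathway: downsample, then apply self-attention encoders.
        self.down = nn.Conv2d(base_ch, base_ch, 3, stride=4, padding=1)
        self.semantic = nn.Sequential(SelfAttentionBlock(base_ch),
                                      SelfAttentionBlock(base_ch))
        self.up = nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False)
        # Decoder/fusion: concatenate both pathways, predict per-pixel classes.
        self.head = nn.Sequential(
            nn.Conv2d(2 * base_ch, base_ch, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(base_ch, num_classes, 1),
        )

    def forward(self, x):
        # Input H and W should be multiples of 4 so the two pathways realign.
        f = self.stem(x)
        spatial_feat = self.spatial(f)
        semantic_feat = self.up(self.semantic(self.down(f)))
        return self.head(torch.cat([spatial_feat, semantic_feat], dim=1))


if __name__ == "__main__":
    # Example: a grayscale full-spine X-ray tile, segmented into background,
    # spine, sacrum and iliac bone (4 classes assumed in this sketch).
    logits = DualPathwaySketch()(torch.randn(1, 1, 256, 128))
    print(logits.shape)  # torch.Size([1, 4, 256, 128])
```

In this sketch the spatial pathway preserves resolution while dilated residual blocks widen the receptive field, and the semantic pathway trades resolution for global self-attention; the paper's actual encoder/decoder design and fusion strategy may differ.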