Segmenting the pelvic region accurately and quickly from T2-weighted MR images benefits the treatment of pelvic-related diseases. Nevertheless, pelvic segmentation remains challenging because of the region's wide range of structure scales, similar tissue intensities, and blurred edges. Although the well-known encoder-decoder convolutional neural network has achieved great success in medical image segmentation, its plain convolutional layers are insufficient to fully extract and propagate features. Moreover, this architecture does not make full use of contextual information at different scales and sometimes fails to capture long-range dependencies. To address these challenges, this paper proposes DFM-Net, which applies the Dense Block as its basic module to extract features and enhance feature propagation. To handle the similarity of tissue intensities, a non-local Feature Similarity Module (FSM) is introduced to capture long-range dependencies of the pelvic structure. To enrich the extraction of global and local context, a new multi-scale method is applied to optimize the model during the up-sampling stage of the decoder. Finally, experiments and evaluations were performed on a T2-weighted pelvic MR image dataset. The encouraging results show that the proposed method outperforms five other segmentation methods.
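The paper does not include the FSM implementation here, but the non-local operation it builds on is well established: each position's output is a similarity-weighted aggregation of features at all positions, which is how long-range dependencies are captured. Below is a minimal NumPy sketch of that generic operation on flattened feature maps; the function name and the embedding weights `w_theta`, `w_phi`, `w_g` are illustrative assumptions, not the authors' code.

```python
import numpy as np

def non_local_block(x, w_theta, w_phi, w_g):
    """Generic non-local operation (embedded-Gaussian form).

    x        : (N, C) features at N flattened spatial positions
    w_theta,
    w_phi,
    w_g      : (C, D) learned embedding weights (hypothetical here)
    returns  : (N, D) features aggregated over all positions
    """
    theta = x @ w_theta              # (N, D) "query" embedding
    phi = x @ w_phi                  # (N, D) "key" embedding
    g = x @ w_g                      # (N, D) "value" embedding

    sim = theta @ phi.T              # (N, N) pairwise feature similarity
    # Softmax over positions j, stabilized by subtracting the row max
    sim = np.exp(sim - sim.max(axis=1, keepdims=True))
    attn = sim / sim.sum(axis=1, keepdims=True)

    # Each output position mixes features from every other position,
    # so dependencies are not limited by a convolutional receptive field.
    return attn @ g
```

In a real network this output would typically pass through a final projection and be added residually to the input, so the block can be dropped into an encoder-decoder without changing feature dimensions.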