Computation, Vol. 11, No. 10 (2023) · Article

A Robust Deep Learning Approach for Accurate Segmentation of Cytoplasm and Nucleus in Noisy Pap Smear Images

Nahida Nazir, Abid Sarwar, Baljit Singh Saini and Rafeeya Shams

Abstract

Cervical cancer poses a significant global health burden, affecting women worldwide. Timely and accurate detection is crucial for effective treatment and improved patient outcomes. The Pap smear test has long been a standard cytology screening method, enabling early cancer diagnosis. However, to enhance quantitative analysis and refine diagnostic capabilities, precise segmentation of the cervical cytoplasm and nucleus using deep learning techniques holds immense promise. This research addresses the primary challenge of achieving accurate segmentation in the presence of the noisy data commonly encountered in Pap smear images. Poisson noise, a prevalent type of noise, corrupts these images and impairs the precise delineation of the cytoplasm and nucleus; as a result, segmentation boundaries become indistinct and overall accuracy is compromised. To overcome these limitations, U-Net, a deep learning architecture designed for automatic segmentation, is proposed to mitigate the adverse effects of Poisson noise on digitized Pap smear slides. The proposed methodology was evaluated on a dataset of 110 Pap smear slides. The experimental results demonstrate that the proposed approach achieves precise segmentation of the nucleus and cytoplasm in noise-free images. By preserving the boundaries of both cellular components, the method facilitates accurate feature extraction and thus contributes to improved diagnostic capabilities. Comparative analysis between noisy and noise-free images shows the superiority of the presented approach in terms of segmentation accuracy, as measured by several metrics, including the Dice coefficient, specificity, sensitivity, and intersection over union (IoU). The findings of this study underline the potential of deep-learning-based segmentation techniques to enhance cervical cancer diagnosis and pave the way for improved quantitative analysis in this critical field of women's health.
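
As a rough illustration of the evaluation setting described in the abstract, the Python sketch below (not the authors' code) simulates Poisson noise on a grayscale slide and computes the reported metrics, Dice coefficient, IoU, sensitivity, and specificity, from binary segmentation masks. The synthetic image, the thresholded stand-in masks, and the helper names add_poisson_noise and segmentation_metrics are illustrative assumptions, not part of the paper.

    # Hypothetical sketch: Poisson noise corruption and segmentation metrics.
    import numpy as np

    def add_poisson_noise(image: np.ndarray) -> np.ndarray:
        """Corrupt an image with values in [0, 1] with signal-dependent Poisson noise."""
        scaled = image * 255.0                      # treat pixel values as photon counts
        noisy = np.random.poisson(scaled).astype(np.float64) / 255.0
        return np.clip(noisy, 0.0, 1.0)

    def segmentation_metrics(pred: np.ndarray, truth: np.ndarray) -> dict:
        """Dice, IoU, sensitivity and specificity for a pair of binary masks."""
        pred, truth = pred.astype(bool), truth.astype(bool)
        tp = np.logical_and(pred, truth).sum()
        tn = np.logical_and(~pred, ~truth).sum()
        fp = np.logical_and(pred, ~truth).sum()
        fn = np.logical_and(~pred, truth).sum()
        eps = 1e-8                                   # avoid division by zero on empty masks
        return {
            "dice":        2 * tp / (2 * tp + fp + fn + eps),
            "iou":         tp / (tp + fp + fn + eps),
            "sensitivity": tp / (tp + fn + eps),
            "specificity": tn / (tn + fp + eps),
        }

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        clean = rng.random((256, 256))               # stand-in for a digitized Pap smear slide
        noisy = add_poisson_noise(clean)
        # Stand-in masks; in the paper these would come from the U-Net output and the annotation.
        truth = clean > 0.5
        pred = noisy > 0.5
        print(segmentation_metrics(pred, truth))

In the study itself the predicted masks would come from the U-Net and the reference masks from expert annotation; the thresholded arrays here only stand in so the metric definitions can be run end to end.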

Similar articles

 
Lei Li, Xiaobao Zeng, Xinpeng Pan, Ling Peng, Yuyang Tan and Jianxin Liu    
Microseismic monitoring plays an essential role in reservoir characterization and in earthquake disaster monitoring and early warning. The accuracy of the subsurface velocity model directly affects the precision of event localization and subsequent process...
Journal: Applied Sciences

 
Woonghee Lee and Younghoon Kim    
This study introduces a deep-learning-based framework for detecting adversarial attacks in CT image segmentation within medical imaging. The proposed methodology includes analyzing features from various layers, particularly focusing on the first layer, a...
Journal: Applied Sciences

 
Adil Redaoui, Amina Belalia and Kamel Belloulata    
Deep network-based hashing has gained significant popularity in recent years, particularly in the field of image retrieval. However, most existing methods only focus on extracting semantic information from the final layer, disregarding valuable structura...
Journal: Information

 
Thomas Kopalidis, Vassilios Solachidis, Nicholas Vretos and Petros Daras    
Recent technological developments have enabled computers to identify and categorize facial expressions to determine a person's emotional state in an image or a video. This process, called "Facial Expression Recognition (FER)", has become one of the most ...
Journal: Information

 
Woonghee Lee, Mingeon Ju, Yura Sim, Young Kul Jung, Tae Hyung Kim and Younghoon Kim    
Deep learning-based segmentation models have made a profound impact on medical procedures, with U-Net based computed tomography (CT) segmentation models exhibiting remarkable performance. Yet, even with these advances, these models are found to be vulner...
Journal: Applied Sciences