Computation, Vol. 11, No. 10 (2023) | Article

A Robust Deep Learning Approach for Accurate Segmentation of Cytoplasm and Nucleus in Noisy Pap Smear Images

Nahida Nazir, Abid Sarwar, Baljit Singh Saini and Rafeeya Shams

Abstract

Cervical cancer poses a significant global health burden, affecting women worldwide. Timely and accurate detection is crucial for effective treatment and improved patient outcomes. The Pap smear test has long been a standard cytology screening method, enabling early cancer diagnosis. However, to enhance quantitative analysis and refine diagnostic capabilities, precise segmentation of the cervical cytoplasm and nucleus using deep learning techniques holds immense promise. This research focuses on addressing the primary challenge of achieving accurate segmentation in the presence of noisy data commonly encountered in Pap smear images. Poisson noise, a prevalent type of noise, corrupts these images, impairing the precise delineation of the cytoplasm and nucleus. Consequently, segmentation boundaries become indistinct, leading to compromised overall accuracy. To overcome these limitations, the utilization of U-Net, a deep learning architecture specifically designed for automatic segmentation, has been proposed. This approach aims to mitigate the adverse effects of Poisson noise on the digitized Pap smear slides. The evaluation of the proposed methodology involved a dataset of 110 Pap smear slides. The experimental results demonstrate that the proposed approach successfully achieves precise segmentation of the nucleus and cytoplasm in noise-free images. By preserving the boundaries of both cellular components, the method facilitates accurate feature extraction, thus contributing to improved diagnostic capabilities. Comparative analysis between noisy and noise-free images reveals the superiority of the presented approach in terms of segmentation accuracy, as measured by various metrics, including the Dice coefficient, specificity, sensitivity, and intersection over union (IoU). 
The findings of this study underline the potential of deep-learning-based segmentation techniques to enhance cervical cancer diagnosis and pave the way for improved quantitative analysis in this critical field of women's health.
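The paper does not publish code, but the noise model and evaluation metrics it names are standard. The sketch below (Python with NumPy; function names are hypothetical, not the authors') shows how Poisson noise can be simulated on an image and how the four reported metrics are computed from binary segmentation masks:

```python
import numpy as np

def poisson_noise(image, rng=None):
    """Simulate Poisson (shot) noise: each output pixel is drawn from a
    Poisson distribution whose mean is the original pixel intensity."""
    rng = np.random.default_rng(rng)
    return rng.poisson(image).astype(image.dtype)

def segmentation_metrics(pred, truth):
    """Dice coefficient, IoU, sensitivity, and specificity for binary masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()    # true positives
    tn = np.logical_and(~pred, ~truth).sum()  # true negatives
    fp = np.logical_and(pred, ~truth).sum()   # false positives
    fn = np.logical_and(~pred, truth).sum()   # false negatives
    return {
        "dice": 2 * tp / (2 * tp + fp + fn),
        "iou": tp / (tp + fp + fn),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# Toy example: a 4x4 ground-truth nucleus mask and a prediction
# that misses one pixel.
truth = np.array([[1, 1, 0, 0],
                  [1, 1, 0, 0],
                  [0, 0, 0, 0],
                  [0, 0, 0, 0]])
pred = np.array([[1, 1, 0, 0],
                 [1, 0, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
m = segmentation_metrics(pred, truth)

noisy = poisson_noise(np.full((8, 8), 50), rng=0)
```

In this toy case the prediction has 3 true-positive pixels, 1 false negative, and no false positives, so Dice = 6/7 and IoU, sensitivity, and specificity follow directly from the confusion counts. Comparing such metrics on masks predicted from clean versus Poisson-corrupted slides is how the abstract's noisy/noise-free comparison would typically be quantified.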

Similar articles

Abdelghani Azri, Adil Haddi and Hakim Allali
Collaborative filtering (CF), a fundamental technique in personalized Recommender Systems, operates by leveraging user-item preference interactions. Matrix factorization remains one of the most prevalent CF-based methods. However, recent advancements in ... see more
Journal: Information

 
Majdi Sukkar, Madhu Shukla, Dinesh Kumar, Vassilis C. Gerogiannis, Andreas Kanavos and Biswaranjan Acharya
Effective collision risk reduction in autonomous vehicles relies on robust and straightforward pedestrian tracking. Challenges posed by occlusion and switching scenarios significantly impede the reliability of pedestrian tracking. In the current study, w... see more
Journal: Information

 
Shengkun Gu and Dejiang Wang
Within the domain of architectural urban informatization, the automated precision recognition of two-dimensional paper schematics emerges as a pivotal technical challenge. Recognition methods traditionally employed frequently encounter limitations due to... see more
Journal: Information

 
Adil Redaoui, Amina Belalia and Kamel Belloulata
Deep network-based hashing has gained significant popularity in recent years, particularly in the field of image retrieval. However, most existing methods only focus on extracting semantic information from the final layer, disregarding valuable structura... see more
Journal: Information

 
Thomas Kopalidis, Vassilios Solachidis, Nicholas Vretos and Petros Daras
Recent technological developments have enabled computers to identify and categorize facial expressions to determine a person's emotional state in an image or a video. This process, called "Facial Expression Recognition (FER)", has become one of the most ... see more
Journal: Information