The World Health Organization (WHO) has called for a global fight against cervical cancer, with an estimated 569,000 new cases and 310,000 deaths annually. Finding practical approaches to cervical cancer screening and treatment is therefore an urgent research subject, and label-free two-photon excited fluorescence (TPEF) imaging is one candidate solution. Colposcopy-guided biopsy, the current method for cervical precancer detection, relies primarily on changes in cell and tissue morphology and organization; however, its overall performance remains unsatisfactory. Label-free TPEF provides images with high morphological and functional (metabolic) content and could lead to enhanced detection of cervical precancers. This paper uses cell texture and morphology features to classify stacks of such TPEF images acquired from freshly excised healthy and precancerous human cervical tissues. An automated denoising algorithm and a parametrized edge enhancement method are used to pre-process the images in each stack. In computer simulations on a private dataset of 10 healthy stacks and 53 precancer stacks, recall and specificity of 100% were observed for both texture and morphology features. However, the dataset used to obtain these results is small. The presented model can serve as a baseline for further research and analysis on a larger dataset to identify early cervical cancerous changes and potentially improve diagnosis and treatment significantly.
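As an illustration of this type of pipeline, the sketch below (Python, assuming scikit-image and scikit-learn) shows slice-wise denoising, edge enhancement, and texture-feature extraction from an image stack. The paper's specific denoising, edge-enhancement, and texture/morphology descriptors are not detailed here, so non-local means, unsharp masking, and GLCM statistics are illustrative placeholders, not the authors' exact methods.

```python
# Minimal sketch of a texture-based stack classification pipeline.
# Assumptions: float slices scaled to [0, 1]; non-local means denoising,
# unsharp masking, and GLCM features stand in for the paper's methods.
import numpy as np
from skimage.restoration import denoise_nl_means, estimate_sigma
from skimage.filters import unsharp_mask
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def preprocess(img):
    """Denoise one TPEF slice, then apply a parametrized edge enhancement."""
    sigma = np.mean(estimate_sigma(img))
    den = denoise_nl_means(img, h=1.15 * sigma, fast_mode=True)
    return unsharp_mask(den, radius=2, amount=1.0)  # radius/amount are tunable

def texture_features(img):
    """Gray-level co-occurrence texture descriptors for one slice."""
    g = (np.clip(img, 0, 1) * 255).astype(np.uint8)
    glcm = graycomatrix(g, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return np.hstack([graycoprops(glcm, p).ravel()
                      for p in ("contrast", "homogeneity", "energy", "correlation")])

def stack_features(stack):
    """Average slice-level features over a TPEF image stack."""
    return np.mean([texture_features(preprocess(s)) for s in stack], axis=0)

# X_train: list of stacks (each a sequence of 2-D slices), y_train: 0 = healthy, 1 = precancer
# clf = SVC(kernel="rbf").fit([stack_features(s) for s in X_train], y_train)
```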
The coronavirus disease (COVID-19) pandemic has affected the health of people around the globe. With the number of confirmed cases and deaths still rising daily, it is crucial to detect positive cases quickly and provide them with the necessary treatment. Several research efforts are currently under way to help control the spread of the epidemic, one of which is faster and more accurate detection. Recent studies have demonstrated that chest CT images contain distinctive COVID-19 features that can be used for efficient diagnosis. However, manually reading these images at scale is laborious and intractable, so an artificial intelligence-based system that captures the relevant information and gives an accurate diagnosis would be beneficial. In this paper, a customized weighted filter-based CNN (CCNN) is proposed. Computer simulations show that the proposed CCNN system (1) improves the effectiveness of distinguishing COVID-19 CT scans from non-COVID-19 CT scans and (2) trains faster than traditional deep learning models.
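For readers who want a concrete starting point, the sketch below (Python/Keras) shows a plain baseline CNN for binary COVID-19 vs. non-COVID-19 CT classification. The customized weighted filters that define the CCNN are not specified in the abstract, so this is only a generic scaffold; the input shape and layer sizes are assumptions.

```python
# Minimal baseline CNN sketch for binary CT classification; NOT the CCNN itself.
# Custom weighted filters could be loaded into a Conv2D layer via set_weights().
import tensorflow as tf
from tensorflow.keras import layers, models

def build_baseline_cnn(input_shape=(224, 224, 1)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # COVID-19 vs. non-COVID-19
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

# model = build_baseline_cnn()
# model.fit(train_ds, validation_data=val_ds, epochs=20)
```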
According to the American Cancer Society, the average lifetime risk of a woman being diagnosed with breast cancer is 13%. The World Health Organization also reports that the number of cancer cases is projected to rise to 19.3 million by 2025. Recent studies indicate that physicians diagnose cancer with only 79% accuracy, while machine learning procedures achieve 91% accuracy or more. The current challenges are early cancer detection and the efficient, accurate diagnosis of histopathology tissue samples. Several deep learning breast cancer classification models have been developed to assist medical practitioners. However, these methods are data-hungry, requiring thousands of training images, often coupled with data augmentation and long training times, to achieve satisfactory results. In this paper, we propose a machine learning classification model that integrates Parameter-Free Threshold Adjacency Statistics (PFTAS) with Fibonacci-p patterns for breast cancer detection. Computer simulations on the BreakHis cancer dataset, in comparison with other machine learning and deep learning-based methods, show that the presented method (i) helps eliminate the dependence on large training data and data augmentation, (ii) is robust to noise and background stains, and (iii) is a lightweight model that is easy to train and deploy.
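A minimal sketch of this feature-based approach is given below (Python, assuming the mahotas and scikit-learn libraries). PFTAS features are computed with mahotas.features.pftas; the Fibonacci-p pattern descriptor is specific to the paper and is not reproduced here, so a standard uniform LBP histogram stands in as a placeholder second texture descriptor.

```python
# Minimal sketch of a lightweight PFTAS-based classifier.
# Assumption: the Fibonacci-p descriptor is replaced by a plain LBP histogram.
import numpy as np
import mahotas
from skimage.feature import local_binary_pattern
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def describe(gray_uint8):
    """Concatenate PFTAS features with a placeholder local-pattern histogram."""
    pftas = mahotas.features.pftas(gray_uint8)
    lbp = local_binary_pattern(gray_uint8, P=8, R=1, method="uniform")
    hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    return np.hstack([pftas, hist])

# train_images: grayscale uint8 histology patches, train_labels: 0 = benign, 1 = malignant
# clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
# clf.fit([describe(im) for im in train_images], train_labels)
```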
The goals of this paper are to (a) test a nuclei-based computer-aided cancer detection system that uses human visual system-based features on histopathology images and (b) compare the results of the proposed system with Local Binary Pattern and modified Fibonacci-p pattern systems. The system performance is evaluated on 251 prostate histopathology images using metrics such as accuracy, specificity, sensitivity, positive predictive value, and negative predictive value. An accuracy of 96.69% was observed for cancer detection using the proposed human visual system-based approach, compared to 87.42% for Local Binary Patterns and 94.70% for the modified Fibonacci-p patterns.
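The reported metrics follow their standard confusion-matrix definitions; the sketch below (Python, assuming scikit-learn and binary labels with 1 = cancer) shows how they are computed.

```python
# Standard evaluation metrics from a binary confusion matrix (1 = cancer, 0 = non-cancer).
from sklearn.metrics import confusion_matrix

def detection_metrics(y_true, y_pred):
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "sensitivity": tp / (tp + fn),   # recall of the cancer class
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# metrics = detection_metrics(labels_for_251_images, predictions_for_251_images)
```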
The goals of this paper are to (1) test computer-aided classification of prostate cancer histopathology images based on the Bag-of-Words (BoW) approach, (2) evaluate the proposed method's performance on grade 3 versus grade 4 classification against the results of the approach proposed by Khurd et al. in [9], and (3) classify the different cancer grades, namely grades 0, 3, 4, and 5, using the proposed approach. The system performance is assessed on 132 prostate cancer histopathology images of different grades. The performance of SURF features is also analyzed by comparing the results with SIFT features for different cluster sizes. The results show 90.15% accuracy in the detection of prostate cancer images using SURF features with 75 clusters for k-means clustering, and higher sensitivity for SURF-based BoW classification compared to SIFT-based BoW classification.
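A minimal sketch of such a BoW pipeline is shown below (Python, assuming OpenCV and scikit-learn). SIFT is used in place of SURF because SURF requires a non-free OpenCV build; the 75-cluster vocabulary follows the figure quoted above, while the SVM classifier and variable names are assumptions for illustration only.

```python
# Minimal Bag-of-Words sketch: local descriptors -> k-means vocabulary -> histogram encoding.
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

sift = cv2.SIFT_create()

def descriptors(gray):
    """Local SIFT descriptors for one grayscale uint8 image."""
    _, desc = sift.detectAndCompute(gray, None)
    return desc if desc is not None else np.empty((0, 128), np.float32)

def bow_histogram(gray, kmeans):
    """Encode an image as a normalized histogram of visual-word assignments."""
    desc = descriptors(gray)
    if len(desc) == 0:
        return np.zeros(kmeans.n_clusters)
    words = kmeans.predict(desc.astype(np.float64))
    hist = np.bincount(words, minlength=kmeans.n_clusters).astype(float)
    return hist / hist.sum()

# Build the visual vocabulary from all training descriptors, then train an SVM
# on the per-image BoW histograms (grades 0, 3, 4, 5 as class labels).
# all_desc = np.vstack([descriptors(im) for im in train_images])
# kmeans = KMeans(n_clusters=75, n_init=10, random_state=0).fit(all_desc)
# clf = SVC(kernel="rbf").fit([bow_histogram(im, kmeans) for im in train_images], train_grades)
```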