Investigator

Muhammad Attique Khan

Assistant Professor / Dr · Prince Mohammad bin Fahd University, Artificial Intelligence


Papers (1): A Novel Network-Level…
Collaborators (1): Yunyoung Nam
Institutions (2): Prince Mohammad Bin F… · Soonchunhyang Univers…

Papers

A Novel Network-Level Fused Self-Attention Deep Neural Network for Cervical Cancer Classification from Cervicography Images

Introduction

Cervical cancer ranks as the fourth most common cancer among females worldwide. Approximately 528,000 new cases are reported annually, and about 85% of them occur in less-developed countries, where the lack of skilled medical staff and pre-screening procedures is the main cause of the high fatality rate. Cervicography is the gold-standard procedure for the evaluation of cervical cancer; however, high intra-class inconsistency makes the diagnosis process challenging even for skilled medical specialists.

Method

In this work, we propose a fully automated computer-aided diagnosis (CAD) system for classifying cervical cancer from cervicography images. Data augmentation is performed in the initial phase to address dataset imbalance. We then propose two novel deep learning modules: the 11-Parallel Inverted Residual Bottleneck Blocks (11-PIRBnet) architecture and the 9-Parallel Inverted Residual Blocks with Self-Attention Mechanism (9-PIRSANet). The two modules are fused at the network level via a depth concatenation layer to form a new network, 375NFNet. The proposed network is trained on the selected dataset, with hyperparameters initialized through Bayesian Optimization (BO). For feature extraction, a depth concatenation layer combines information from both deep learning modules during testing. Finally, the extracted features are classified by a shallow neural network (SNN) to produce the final prediction.

Result

Experiments were conducted on a publicly available cervical screening dataset of cervicography images; the results demonstrate an accuracy of 95.5%, a precision of 95.4%, and an area under the curve of 0.97. Compared with several pre-trained techniques, the proposed architecture achieves higher accuracy and precision with fewer trainable parameters.

Conclusion

The proposed 375NFNet architecture demonstrates strong accuracy and efficiency in classifying cervical cancer from cervicography images, showing its potential as a valuable tool in resource-constrained environments.
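The fusion-and-classification stage described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the channel counts (256 and 224), the hidden-layer size, the two-class assumption, and all weights are illustrative placeholders, and the two feature arrays merely stand in for the outputs of the 11-PIRBnet and 9-PIRSANet branches.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-image feature vectors from the two branches for a
# batch of 4 images; channel counts are illustrative, not from the paper.
feat_a = rng.standard_normal((4, 256))  # stand-in for 11-PIRBnet features
feat_b = rng.standard_normal((4, 224))  # stand-in for 9-PIRSANet features

# Depth concatenation: join the two feature sets along the channel axis,
# mirroring the network-level fusion via a depth concatenation layer.
fused = np.concatenate([feat_a, feat_b], axis=1)  # shape (4, 480)

def shallow_nn(x, w1, b1, w2, b2):
    """Shallow neural network: one ReLU hidden layer plus softmax output."""
    h = np.maximum(x @ w1 + b1, 0.0)               # hidden layer
    logits = h @ w2 + b2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)        # class probabilities

# Randomly initialized weights purely for the demonstration.
w1 = rng.standard_normal((480, 64)) * 0.05
b1 = np.zeros(64)
w2 = rng.standard_normal((64, 2)) * 0.05           # 2 classes assumed
b2 = np.zeros(2)

probs = shallow_nn(fused, w1, b1, w2, b2)
print(fused.shape, probs.shape)                    # (4, 480) (4, 2)
```

Concatenating along the channel axis preserves both branches' features intact, leaving the shallow classifier to learn how to weight them; this is the usual motivation for depth concatenation over element-wise addition when the branches capture complementary information.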

Works (75) · Papers (1) · Collaborators (1)
Uterine Cervical Neoplasms · Diagnosis, Computer-Assisted

Positions

2024–

Assistant Professor / Dr

Prince Mohammad bin Fahd University · Artificial Intelligence