Open Access
Volume 13, Issue 5, Page 882

A Novel SHAP-GAN Network for Interpretable Ovarian Cancer Diagnosis

Publication type: Journal Article
Publication date: 2025-03-06
Scimago: Q2
WoS: Q1
SJR: 0.498
CiteScore: 4.6
Impact factor: 2.2
ISSN: 2227-7390
Abstract

Ovarian cancer stands out as one of the most formidable adversaries in women’s health, largely due to its typically subtle and nonspecific early symptoms, which pose significant challenges to early detection and diagnosis. Although existing diagnostic methods, such as biomarker testing and imaging, can help with early diagnosis to some extent, these methods still have limitations in sensitivity and accuracy, often leading to misdiagnosis or missed diagnosis. Ovarian cancer’s high heterogeneity and complexity increase diagnostic challenges, especially in disease progression prediction and patient classification. Machine learning (ML) has outperformed traditional methods in cancer detection by processing large datasets to identify patterns missed by conventional techniques. However, existing AI models still struggle with accuracy in handling imbalanced and high-dimensional data, and their “black-box” nature limits clinical interpretability. To address these issues, this study proposes SHAP-GAN, an innovative diagnostic model for ovarian cancer that integrates Shapley Additive exPlanations (SHAP) with Generative Adversarial Networks (GANs). The SHAP module quantifies each biomarker’s contribution to the diagnosis, while the GAN component optimizes medical data generation. This approach tackles three key challenges in medical diagnosis: data scarcity, model interpretability, and diagnostic accuracy. Results show that SHAP-GAN outperforms traditional methods in sensitivity, accuracy, and interpretability, particularly with high-dimensional and imbalanced ovarian cancer datasets. The top three influential features identified are PRR11, CIAO1, and SMPD3, which exhibit wide SHAP value distributions, highlighting their significant impact on model predictions. The SHAP-GAN network has demonstrated an impressive accuracy rate of 99.34% on the ovarian cancer dataset, significantly outperforming baseline algorithms, including Support Vector Machines (SVM), Logistic Regression (LR), and XGBoost. Specifically, SVM achieved an accuracy of 72.78%, LR achieved 86.09%, and XGBoost achieved 96.69%. These results highlight the superior performance of SHAP-GAN in handling high-dimensional and imbalanced datasets. Furthermore, SHAP-GAN significantly alleviates the challenges associated with intricate genetic data analysis, empowering medical professionals to tailor personalized treatment strategies for individual patients.
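The abstract describes the model only at a high level. The pipeline it outlines, GAN-based augmentation of an imbalanced biomarker matrix followed by SHAP attribution of the trained classifier's predictions, can be sketched as below. This is a minimal illustrative sketch, not the authors' published architecture: the toy data, network sizes, training schedule, and the choice of the PyTorch/xgboost/shap stack are all assumptions made for illustration.

import numpy as np
import torch
import torch.nn as nn
import shap
from xgboost import XGBClassifier

def make_gan(n_features, latent_dim=32):
    # Tiny fully connected generator/discriminator pair for tabular rows.
    generator = nn.Sequential(
        nn.Linear(latent_dim, 64), nn.ReLU(),
        nn.Linear(64, n_features),
    )
    discriminator = nn.Sequential(
        nn.Linear(n_features, 64), nn.LeakyReLU(0.2),
        nn.Linear(64, 1), nn.Sigmoid(),
    )
    return generator, discriminator

def augment_minority(X_min, n_synthetic, epochs=500, latent_dim=32):
    # Train the GAN on minority-class rows only, then sample synthetic rows.
    X = torch.tensor(X_min, dtype=torch.float32)
    G, D = make_gan(X.shape[1], latent_dim)
    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCELoss()
    real_lbl = torch.ones(len(X), 1)
    fake_lbl = torch.zeros(len(X), 1)
    for _ in range(epochs):
        # Discriminator step: separate real rows from generator output.
        z = torch.randn(len(X), latent_dim)
        fake = G(z).detach()
        loss_d = bce(D(X), real_lbl) + bce(D(fake), fake_lbl)
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()
        # Generator step: produce rows the discriminator accepts as real.
        z = torch.randn(len(X), latent_dim)
        loss_g = bce(D(G(z)), real_lbl)
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    with torch.no_grad():
        return G(torch.randn(n_synthetic, latent_dim)).numpy()

# Toy stand-in for a high-dimensional, imbalanced biomarker matrix
# (300 samples, 50 features, roughly 10% positive class).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 50))
y = (rng.random(300) < 0.1).astype(int)

# Balance the training set with GAN-generated positive-class rows.
X_syn = augment_minority(X[y == 1], n_synthetic=200)
X_bal = np.vstack([X, X_syn])
y_bal = np.concatenate([y, np.ones(len(X_syn), dtype=int)])

clf = XGBClassifier(n_estimators=200).fit(X_bal, y_bal)

# SHAP attributes each prediction to individual features; broad SHAP value
# distributions mark the influential biomarkers.
explainer = shap.TreeExplainer(clf)
shap_values = explainer.shap_values(X)
print("mean |SHAP| per feature (first 5):", np.abs(shap_values).mean(axis=0)[:5])

In the paper's reported results, the wide SHAP value distributions of PRR11, CIAO1, and SMPD3 are what single them out as the most influential features; the per-feature summary printed above plays the same role on the toy data.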

Cite this
GOST
Cai J. et al. A Novel SHAP-GAN Network for Interpretable Ovarian Cancer Diagnosis // Mathematics. 2025. Vol. 13. No. 5. p. 882.
GOST (all authors)
Cai J., Lee Z. J., Lin Z., Yang M. A Novel SHAP-GAN Network for Interpretable Ovarian Cancer Diagnosis // Mathematics. 2025. Vol. 13. No. 5. p. 882.
RIS
TY - JOUR
DO - 10.3390/math13050882
UR - https://www.mdpi.com/2227-7390/13/5/882
TI - A Novel SHAP-GAN Network for Interpretable Ovarian Cancer Diagnosis
T2 - Mathematics
AU - Cai, Jingxun
AU - Lee, Zne Jung
AU - Lin, Zhihxian
AU - Yang, Ming-Ren
PY - 2025
DA - 2025/03/06
PB - MDPI
SP - 882
IS - 5
VL - 13
SN - 2227-7390
ER -
BibTeX
@article{2025_Cai,
author = {Jingxun Cai and Zne Jung Lee and Zhihxian Lin and Ming-Ren Yang},
title = {A Novel SHAP-GAN Network for Interpretable Ovarian Cancer Diagnosis},
journal = {Mathematics},
year = {2025},
volume = {13},
publisher = {MDPI},
month = {mar},
url = {https://www.mdpi.com/2227-7390/13/5/882},
number = {5},
pages = {882},
doi = {10.3390/math13050882}
}
MLA
Cai, Jingxun, et al. “A Novel SHAP-GAN Network for Interpretable Ovarian Cancer Diagnosis.” Mathematics, vol. 13, no. 5, Mar. 2025, p. 882. https://www.mdpi.com/2227-7390/13/5/882.