Volume 194, p. 116154

A comprehensive survey of fractional gradient descent methods and their convergence analysis

Publication type: Journal Article
Publication date: 2025-05-01
SCImago: Q1
WoS: Q1
БС: 1
SJR: 1.123
CiteScore: 9.9
Impact factor: 5.6
ISSN: 0960-0779, 1873-2887
Abstract
Fractional Gradient Descent (FGD) methods extend classical optimization algorithms by integrating fractional calculus, leading to notable improvements in convergence speed, stability, and accuracy. However, recent studies indicate that engineering challenges, such as tensor-based differentiation in deep neural networks, remain partially unresolved, prompting further investigation into the scalability and computational feasibility of FGD. This paper provides a comprehensive review of recent advancements in FGD techniques, focusing on their approximation methods and convergence properties. These methods are systematically categorized by the strategies they use to overcome convergence challenges inherent in fractional-order calculations, such as non-locality and long-memory effects. Key techniques examined include modified fractional-order gradients designed to avoid singularities and ensure convergence to the true extremum. Adaptive step-size strategies and variable fractional-order schemes are analyzed, balancing rapid convergence with precise parameter estimation. Additionally, truncation methods are explored as a means of mitigating the oscillatory behavior associated with fractional derivatives. By synthesizing convergence analyses from multiple studies, the paper offers insights into the theoretical foundations of these methods, including proofs of linear convergence. Ultimately, it highlights the effectiveness of various FGD approaches in accelerating convergence and enhancing stability, while also acknowledging significant gaps in practical implementations for large-scale engineering tasks, including deep learning. The review serves as a resource for researchers and practitioners in selecting appropriate FGD techniques for different optimization problems.
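The truncation idea mentioned in the abstract can be sketched in a few lines. The example below is an illustrative assumption, not code from the survey: it uses one common Caputo-style first-order approximation in which the classical gradient is scaled by |x_k − x_{k−1}|^(1−α) / Γ(2−α), with the previous iterate serving as a moving lower terminal so the singular memory term stays bounded.

```python
import math

def fractional_gd(grad, x0, alpha=0.9, lr=0.1, steps=100, eps=1e-8):
    """Minimal fractional gradient descent sketch (illustrative, 1-D).

    The classical gradient is scaled by |x_k - x_{k-1}|^(1-alpha) / Gamma(2-alpha),
    a truncated Caputo-style approximation with the previous iterate as the
    moving lower terminal -- one way to avoid the singularity at the terminal.
    """
    x_prev, x = x0, x0
    scale = 1.0 / math.gamma(2.0 - alpha)
    for _ in range(steps):
        # memory term: distance to the previous iterate, floored so the
        # very first step (where x == x_prev) does not stall at zero
        mem = max(abs(x - x_prev), eps) ** (1.0 - alpha)
        step = lr * scale * mem * grad(x)
        x_prev, x = x, x - step
    return x

# Usage: minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3)
x_star = fractional_gd(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

Note that as alpha approaches 1 the memory factor tends to 1 and the scale to 1/Γ(1) = 1, recovering classical gradient descent.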

Top 30

Journals

Mathematics
2 publications, 16.67%
Neurocomputing
2 publications, 16.67%
Energy and Buildings
1 publication, 8.33%
Inverse Problems
1 publication, 8.33%
Scientific Reports
1 publication, 8.33%
Cluster Computing
1 publication, 8.33%
Information Processing and Management
1 publication, 8.33%
Physica Scripta
1 publication, 8.33%
IFAC-PapersOnLine
1 publication, 8.33%
Vehicular Communications
1 publication, 8.33%

Publishers

Elsevier
6 publications, 50%
IOP Publishing
2 publications, 16.67%
MDPI
2 publications, 16.67%
Springer Nature
2 publications, 16.67%
  • Publications without a DOI are not counted.
  • Publication statistics are updated weekly.

Metrics: 12

GOST
Elnady S. M. et al. A comprehensive survey of fractional gradient descent methods and their convergence analysis // Chaos, Solitons and Fractals. 2025. Vol. 194. p. 116154.
GOST with all authors (up to 50)
Elnady S. M., El-Beltagy M., Radwan A. G., Fouda M. E. A comprehensive survey of fractional gradient descent methods and their convergence analysis // Chaos, Solitons and Fractals. 2025. Vol. 194. p. 116154.
RIS
TY - JOUR
DO - 10.1016/j.chaos.2025.116154
UR - https://linkinghub.elsevier.com/retrieve/pii/S0960077925001675
TI - A comprehensive survey of fractional gradient descent methods and their convergence analysis
T2 - Chaos, Solitons and Fractals
AU - Elnady, Sroor M.
AU - El-Beltagy, Mohamed
AU - Radwan, Ahmed G.
AU - Fouda, Mohammed E
PY - 2025
DA - 2025/05/01
PB - Elsevier
SP - 116154
VL - 194
SN - 0960-0779
SN - 1873-2887
ER -
BibTeX (up to 50 authors)
@article{2025_Elnady,
author = {Sroor M. Elnady and Mohamed El-Beltagy and Ahmed G. Radwan and Mohammed E Fouda},
title = {A comprehensive survey of fractional gradient descent methods and their convergence analysis},
journal = {Chaos, Solitons and Fractals},
year = {2025},
volume = {194},
publisher = {Elsevier},
month = {may},
url = {https://linkinghub.elsevier.com/retrieve/pii/S0960077925001675},
pages = {116154},
doi = {10.1016/j.chaos.2025.116154}
}