Open Access
Publication number: moor.2024.0479

Developing Lagrangian-Based Methods for Nonsmooth Nonconvex Optimization

Publication type: Journal Article
Publication date: 2026-01-08
Scimago: Q1
WoS: Q1
White list level: БС2
SJR: 1.349
CiteScore: 4
Impact factor: 1.9
ISSN: 0364-765X, 1526-5471
Abstract

In this paper, we consider the minimization of a nonsmooth nonconvex objective function [Formula: see text] over a closed convex subset [Formula: see text] of [Formula: see text], with additional nonsmooth nonconvex constraints [Formula: see text]. We develop a unified framework for Lagrangian-based methods in which each iteration performs a single-step update of the primal variables by some subgradient method. These subgradient methods are "embedded" into our framework in the sense that they are incorporated as black-box updates to the primal variables. We prove that our proposed framework inherits the global convergence guarantees of these embedded subgradient methods under mild conditions. In addition, we show that our framework can be extended to solve constrained optimization problems with expectation constraints. Based on the proposed framework, we show that a wide range of existing stochastic subgradient methods, including proximal stochastic subgradient descent (SGD), proximal momentum SGD, and the proximal adaptive moment estimation method (ADAM), can be embedded into Lagrangian-based methods. Preliminary numerical experiments on deep learning tasks illustrate that our proposed framework yields efficient variants of Lagrangian-based methods with convergence guarantees for nonsmooth nonconvex constrained optimization problems.
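The single-loop idea described in the abstract, one embedded subgradient step on the primal variables followed by a multiplier update, can be illustrated with a minimal sketch. The code below is a hypothetical toy implementation under strong simplifying assumptions (one scalar variable, one inequality constraint, a plain projected-subgradient primal update, an augmented-Lagrangian penalty); the function names, the step sizes, and the exact multiplier update are assumptions for illustration, not the paper's algorithm.

```python
import math

def project_interval(x, lo, hi):
    """Projection onto the interval [lo, hi], playing the role of the
    closed convex set X."""
    return min(max(x, lo), hi)

def single_loop_lagrangian(f_subgrad, c, c_subgrad, lo, hi, x0,
                           rho=10.0, step=1e-2, iters=5000):
    """Illustrative single-loop Lagrangian-based scheme: every iteration
    takes ONE projected subgradient step on the primal variable (the
    'embedded' black-box update) followed by a projected dual ascent
    step on the multiplier of the constraint c(x) <= 0."""
    x, lam = x0, 0.0
    for _ in range(iters):
        # Subgradient of the augmented Lagrangian
        #   L_rho(x, lam) = f(x) + lam * c(x) + (rho / 2) * max(c(x), 0)**2
        # with respect to x.
        g = f_subgrad(x) + (lam + rho * max(c(x), 0.0)) * c_subgrad(x)
        x = project_interval(x - step * g, lo, hi)   # embedded primal step
        lam = max(lam + step * c(x), 0.0)            # dual ascent, lam >= 0
    return x, lam

# Toy instance: minimize the nonsmooth f(x) = |x - 2| subject to
# x - 1 <= 0 over X = [-3, 3]; the solution is x* = 1 with lam* = 1.
x_star, lam_star = single_loop_lagrangian(
    f_subgrad=lambda x: math.copysign(1.0, x - 2.0),
    c=lambda x: x - 1.0,
    c_subgrad=lambda x: 1.0,
    lo=-3.0, hi=3.0, x0=0.0)
```

Swapping the projected-subgradient line for a momentum or ADAM-style step would mimic the "embedding" of other primal updates that the abstract describes, without touching the dual update.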

Funding: The research of X. Hu was supported by the National Natural Science Foundation of China [Grant 12301408]. The research of K.-C. Toh was supported by the Ministry of Education—Singapore [Grant MOE-T2EP20224-0017].

Cite

GOST
Xiao N. et al. Developing Lagrangian-Based Methods for Nonsmooth Nonconvex Optimization // Mathematics of Operations Research. 2026. moor.2024.0479
GOST with all authors (up to 50)
Xiao N., Ding K., Hu X., Toh K. Developing Lagrangian-Based Methods for Nonsmooth Nonconvex Optimization // Mathematics of Operations Research. 2026. moor.2024.0479
RIS
TY - JOUR
DO - 10.1287/moor.2024.0479
UR - https://pubsonline.informs.org/doi/10.1287/moor.2024.0479
TI - Developing Lagrangian-Based Methods for Nonsmooth Nonconvex Optimization
T2 - Mathematics of Operations Research
AU - Xiao, Nachuan
AU - Ding, Kuangyu
AU - Hu, Xiao-Yin
AU - Toh, Kim-Chuan
PY - 2026
DA - 2026/01/08
PB - Institute for Operations Research and the Management Sciences (INFORMS)
SN - 0364-765X
SN - 1526-5471
ER -
BibTeX

BibTeX (up to 50 authors)
@article{2026_Xiao,
author = {Nachuan Xiao and Kuangyu Ding and Xiao-Yin Hu and Kim-Chuan Toh},
title = {Developing Lagrangian-Based Methods for Nonsmooth Nonconvex Optimization},
journal = {Mathematics of Operations Research},
year = {2026},
publisher = {Institute for Operations Research and the Management Sciences (INFORMS)},
month = {jan},
url = {https://pubsonline.informs.org/doi/10.1287/moor.2024.0479},
pages = {moor.2024.0479},
doi = {10.1287/moor.2024.0479}
}