Convergence analysis of descent optimization algorithms under Polyak-Lojasiewicz-Kurdyka conditions



Publisher

Universidade Federal de Goiás

Abstract

This thesis presents a comprehensive convergence analysis of generic classes of descent algorithms in nonsmooth and nonconvex optimization under the Polyak-Lojasiewicz-Kurdyka (PLK) property. In particular, we revisit and extend the convergence-rate results of Khanh, Mordukhovich, and Tran (J. Optim. Theory Appl., 2023), refining the understanding of the zero exponent for smooth PLK functions and broadening, to more general settings, the discussion of the inconsistency between the lower-exponent PLK property and Lipschitz continuity of gradients. Among other contributions, we establish finite termination of generic algorithms under lower-exponent PLK conditions. Additionally, we derive new convergence rates for inexact reduced gradient methods and for certain variants of the boosted algorithm in DC programming. We present novel results under a modified error condition, obtaining either finite or superlinear convergence of the generated sequences. Notably, we reveal that for a broad class of difference-of-convex programs, lower-exponent PLK conditions are inherently incompatible with Lipschitz continuity of the gradient of the plus function near a local minimizer. However, we demonstrate that this inconsistency may fail to hold if Lipschitz continuity is replaced by mere gradient continuity.
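For context, a common smooth formulation of the exponent-dependent PLK property mentioned above can be sketched as follows; the constants c, the exponent q, and the neighborhood U are illustrative choices for exposition, not quantities taken from the thesis:

```latex
% PLK property of a smooth f at a local minimizer \bar{x}, exponent q \in [0,1):
% there exist c > 0 and a neighborhood U of \bar{x} such that
\|\nabla f(x)\| \;\ge\; c\,\bigl(f(x) - f(\bar{x})\bigr)^{q}
\quad \text{for all } x \in U \text{ with } f(x) > f(\bar{x}).
% q = 1/2 recovers the classical Polyak-Lojasiewicz inequality; in this
% convention "lower exponent" refers to q < 1/2, the regime in which
% descent methods may terminate finitely or converge superlinearly.
```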


Citation

MOTA, T. S. Convergence analysis of descent optimization algorithms under Polyak-Lojasiewicz-Kurdyka conditions. 2025. 86 f. Tese (Doutorado em Matemática) - Instituto de Matemática e Estatística, Universidade Federal de Goiás, Goiânia, 2025.