TY - JOUR
T1 - A proximal algorithm with backtracked extrapolation for a class of structured fractional programming
AU - Li, Qia
AU - Shen, Lixin
AU - Zhang, Na
AU - Zhou, Junpeng
N1 - Publisher Copyright:
© 2021 Elsevier Inc.
PY - 2022/1
Y1 - 2022/1
N2 - In this paper, we consider a class of structured fractional minimization problems in which the numerator of the objective is the sum of a convex function and a Lipschitz differentiable (possibly) nonconvex function, while the denominator is a convex function. By exploiting the structure of the problem, we propose a first-order algorithm, namely, a proximal-gradient-subgradient algorithm with backtracked extrapolation (PGSA_BE), for solving this type of optimization problem. It is worth pointing out that our backtracked extrapolation differs in several respects from other popular extrapolation techniques used in convex and nonconvex optimization. One such difference is as follows: if the new iterate obtained from the extrapolated iteration satisfies a backtracking condition, then it is replaced by the iterate generated from the non-extrapolated iteration. We show that any accumulation point of the sequence generated by PGSA_BE is a critical point of the problem under consideration. In addition, by assuming that certain auxiliary functions satisfy the Kurdyka-Łojasiewicz property, we establish global convergence of the entire sequence in the case where the denominator is locally Lipschitz differentiable or its conjugate satisfies the calmness condition. Finally, we present some preliminary numerical results to illustrate the efficiency of PGSA_BE.
AB - In this paper, we consider a class of structured fractional minimization problems in which the numerator of the objective is the sum of a convex function and a Lipschitz differentiable (possibly) nonconvex function, while the denominator is a convex function. By exploiting the structure of the problem, we propose a first-order algorithm, namely, a proximal-gradient-subgradient algorithm with backtracked extrapolation (PGSA_BE), for solving this type of optimization problem. It is worth pointing out that our backtracked extrapolation differs in several respects from other popular extrapolation techniques used in convex and nonconvex optimization. One such difference is as follows: if the new iterate obtained from the extrapolated iteration satisfies a backtracking condition, then it is replaced by the iterate generated from the non-extrapolated iteration. We show that any accumulation point of the sequence generated by PGSA_BE is a critical point of the problem under consideration. In addition, by assuming that certain auxiliary functions satisfy the Kurdyka-Łojasiewicz property, we establish global convergence of the entire sequence in the case where the denominator is locally Lipschitz differentiable or its conjugate satisfies the calmness condition. Finally, we present some preliminary numerical results to illustrate the efficiency of PGSA_BE.
KW - Backtracked extrapolation
KW - Fractional programming
KW - Proximal algorithm
KW - Sparse recovery
UR - http://www.scopus.com/inward/record.url?scp=85113403674&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85113403674&partnerID=8YFLogxK
U2 - 10.1016/j.acha.2021.08.004
DO - 10.1016/j.acha.2021.08.004
M3 - Article
AN - SCOPUS:85113403674
SN - 1063-5203
VL - 56
SP - 98
EP - 122
JO - Applied and Computational Harmonic Analysis
JF - Applied and Computational Harmonic Analysis
ER -