A simple convergence analysis of Bregman proximal gradient algorithm

Yi Zhou, Yingbin Liang, Lixin Shen

Research output: Contribution to journal › Article › peer-review

5 Scopus citations

Abstract

In this paper, we provide a simple convergence analysis of the proximal gradient algorithm with Bregman distance, which yields a tighter bound than existing results. In particular, for the problem of minimizing a class of convex objective functions, we show that the proximal gradient algorithm with Bregman distance can be viewed as a proximal point algorithm that incorporates another Bregman distance. Consequently, the convergence result of the proximal gradient algorithm with Bregman distance follows directly from that of the proximal point algorithm with Bregman distance, and this leads to a simpler convergence analysis with a tighter convergence bound than existing ones. We further propose and analyze the backtracking line-search variant of the proximal gradient algorithm with Bregman distance.
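The iteration studied in the abstract can be illustrated with a minimal sketch. Assuming the composite problem min f(x) + g(x) with f smooth, and choosing the reference function h as the negative entropy (whose Bregman distance is the KL divergence) with g the indicator of the probability simplex, the Bregman proximal gradient update has a closed-form multiplicative step (the classical "exponentiated gradient" instance). The problem data and step size below are illustrative assumptions, not from the paper:

```python
import numpy as np

def bregman_prox_grad(grad_f, x0, step=0.02, iters=2000):
    """Bregman proximal gradient with negative-entropy h on the simplex.

    Each iteration solves
        x_{k+1} = argmin_x <grad_f(x_k), x> + g(x) + (1/step) * D_h(x, x_k),
    which for this choice of h and g reduces to a multiplicative update
    followed by a normalization (the Bregman projection onto the simplex).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x * np.exp(-step * grad_f(x))  # mirror step in the dual space of h
        x = x / x.sum()                    # Bregman projection onto the simplex
    return x

# Illustrative instance: least squares over the simplex,
# f(x) = 0.5 * ||A x - b||^2 with b generated from a known simplex point.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 5))
b = A @ np.array([0.1, 0.2, 0.3, 0.25, 0.15])
grad_f = lambda x: A.T @ (A @ x - b)

x_star = bregman_prox_grad(grad_f, np.ones(5) / 5)
```

The step size must respect the relative smoothness of f with respect to h; the backtracking line-search variant mentioned in the abstract would adapt `step` automatically instead of fixing it in advance.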

Original language: English (US)
Pages (from-to): 903-912
Number of pages: 10
Journal: Computational Optimization and Applications
Volume: 73
Issue number: 3
DOIs
State: Published - Jul 1 2019

Keywords

  • Bregman distance
  • Convergence analysis
  • Line-search
  • Proximal algorithms

ASJC Scopus subject areas

  • Control and Optimization
  • Computational Mathematics
  • Applied Mathematics
