Please use this identifier to cite or link to this item:
http://hdl.handle.net/2080/5763

| Title: | Convergence Guarantees for First-Order Methods via Lyapunov Analysis in Composite Strong-Weak Convex Optimization |
| Authors: | Barik, Milan; Pattanaik, Suvendu Ranjan |
| Keywords: | Nesterov accelerated gradient method; Ravine method; Proximal gradient method |
| Issue Date: | Mar-2026 |
| Publisher: | IIT BHU |
| Citation: | 1st International Conference on Mathematical Optimization Theory and Applications, IIT BHU, Varanasi, India, 14-16 March 2026 |
| Abstract: | Nesterov’s accelerated gradient (NAG) method extends the classical gradient descent algorithm by improving the convergence rate from O(1/t) to O(1/t²) in convex optimization. In this work, we study the proximal gradient framework for additively separable composite objectives consisting of smooth and non-smooth terms. We show that the Nesterov accelerated proximal gradient method (NAPGα) achieves a convergence rate of o(1/t²) for strong–weak convex functions when α > 3. A Lyapunov-based analysis is developed to establish the fast convergence of the composite gradient operator in the setting where the smooth component is strongly convex and the non-smooth component is weakly convex. Furthermore, we prove the equivalence between the Nesterov accelerated proximal gradient method and the Ravine accelerated proximal gradient scheme. |
| Description: | Copyright belongs to proceeding publisher. |
| URI: | http://hdl.handle.net/2080/5763 |
| Appears in Collections: | Conference Papers |
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| 2026_ICMOTA_MBarik_Convergence.pdf | Presentation | 3.35 MB | Adobe PDF |
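For orientation, the accelerated proximal gradient iteration summarized in the abstract can be sketched on a standard composite problem. This is an illustrative sketch only, not the paper's method: the lasso objective, the step size, and the specific momentum weight (k − 1)/(k + α − 1) with α > 3 are assumptions chosen to mirror the NAPGα setting described above.

```python
import numpy as np

def soft_threshold(v, lam):
    """Proximal operator of lam * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def accelerated_proximal_gradient(A, b, lam, alpha=4.0, iters=500):
    """Illustrative Nesterov-style accelerated proximal gradient for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    Momentum weight (k-1)/(k+alpha-1) with alpha > 3 is an assumption
    mirroring the NAPG_alpha parameterization in the abstract."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = y = np.zeros(A.shape[1])
    for k in range(1, iters + 1):
        grad = A.T @ (A @ y - b)           # gradient of the smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)          # proximal step
        y = x_new + ((k - 1) / (k + alpha - 1)) * (x_new - x)  # extrapolation
        x = x_new
    return x
```

A quick use: on a small synthetic least-squares problem with a sparse ground truth, the iterates drive the composite objective well below its value at the zero initialization.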
