Please use this identifier to cite or link to this item: http://hdl.handle.net/2080/5763
Full metadata record
DC Field | Value | Language
dc.contributor.author | Barik, Milan | -
dc.contributor.author | Pattanaik, Suvendu Ranjan | -
dc.date.accessioned | 2026-04-02T12:32:12Z | -
dc.date.available | 2026-04-02T12:32:12Z | -
dc.date.issued | 2026-03 | -
dc.identifier.citation | 1st International Conference on Mathematical Optimization Theory and Applications, IIT BHU, Varanasi, India, 14-16 March 2026 | en_US
dc.identifier.uri | http://hdl.handle.net/2080/5763 | -
dc.description | Copyright belongs to the proceedings publisher. | en_US
dc.description.abstract | Nesterov's accelerated gradient (NAG) method extends the classical gradient descent algorithm by improving the convergence rate from O(1/t) to O(1/t^2) in convex optimization. In this work, we study the proximal gradient framework for additively separable composite objectives consisting of smooth and non-smooth terms. We show that the Nesterov accelerated proximal gradient method (NAPGα) achieves a convergence rate of o(1/t^2) for strong-weak convex functions when α > 3. A Lyapunov-based analysis is developed to establish the fast convergence of the composite gradient operator in the setting where the smooth component is strongly convex and the non-smooth component is weakly convex. Furthermore, we prove the equivalence between the Nesterov accelerated proximal gradient method and the Ravine accelerated proximal gradient scheme. | en_US
dc.language.iso | en_US | en_US
dc.publisher | IIT BHU | en_US
dc.subject | Nesterov accelerated gradient method | en_US
dc.subject | Ravine method | en_US
dc.subject | Proximal gradient method | en_US
dc.title | Convergence Guarantees for First-Order Methods via Lyapunov Analysis in Composite Strong-Weak Convex Optimization | en_US
dc.type | Presentation | en_US
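The accelerated proximal gradient scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: the composite objective (least squares plus an l1 term), the soft-thresholding proximal operator, and the momentum schedule (k-1)/(k+α-1) with α > 3 are all standard choices assumed here for concreteness.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (handles the non-smooth term).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def napg(A, b, lam, alpha=4.0, iters=500):
    """Sketch of a Nesterov accelerated proximal gradient method for the
    composite objective 0.5*||Ax - b||^2 + lam*||x||_1 (illustrative only)."""
    n = A.shape[1]
    x = np.zeros(n)
    y = x.copy()
    # Lipschitz constant of the smooth gradient A^T(Ax - b) is ||A||_2^2.
    step = 1.0 / (np.linalg.norm(A, 2) ** 2)
    for k in range(1, iters + 1):
        grad = A.T @ (A @ y - b)
        # Proximal gradient step on the extrapolated point y.
        x_new = soft_threshold(y - step * grad, step * lam)
        # Nesterov extrapolation; momentum (k-1)/(k+alpha-1), alpha > 3.
        y = x_new + (k - 1) / (k + alpha - 1) * (x_new - x)
        x = x_new
    return x
```

With α = 3 this recovers the classical accelerated schedule; the abstract's o(1/t^2) rate concerns the regime α > 3.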
Appears in Collections:Conference Papers

Files in This Item:
File | Description | Size | Format
2026_ICMOTA_MBarik_Convergence.pdf | Presentation | 3.35 MB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.