Subgradient Methods in Infinite Dimensional Hilbert Spaces

Speaker: Hong-Kun Xu (Hangzhou Dianzi University, China)


Time: 9:30-10:30, Wednesday, 19/7/2013

Venue: Room 612, Building A6

Abstract: Subgradient methods, introduced by Shor and developed by Alber, Iusem, Nesterov, Polyak, Solodov, and many others, are used to solve nondifferentiable optimization problems. The major difference from gradient descent methods (or projection-gradient methods) for differentiable optimization problems lies in how the step-sizes are selected. For instance, constant step-sizes, which suffice for differentiable objective functions, no longer work for nondifferentiable objective functions; in the latter case, diminishing step-sizes must be adopted. In this talk, we first review some existing projected subgradient methods; the main purpose is to discuss weak and strong convergence of projected subgradient methods in an infinite-dimensional Hilbert space. In particular, some regularization techniques that yield strong convergence of projected subgradient methods will be presented. An extension to the proximal-subgradient method for minimizing the sum of two nondifferentiable convex functions will also be discussed.
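To illustrate the method discussed in the abstract, here is a minimal sketch of a projected subgradient iteration with diminishing step-sizes, on a toy finite-dimensional problem (the talk concerns the infinite-dimensional Hilbert space setting, where the same update formula applies). The concrete objective, constraint set, and step-size rule below are illustrative choices, not taken from the talk: minimize the nonsmooth convex function f(x) = |x1 - 1| + |x2 + 2| over the closed Euclidean unit ball, with steps a_k = 1/sqrt(k+1).

```python
# Projected subgradient method (sketch): x_{k+1} = P_C(x_k - a_k * g_k),
# where g_k is any subgradient of f at x_k and P_C is the metric projection.
# A constant step-size would in general stall at a nonzero error for this
# nonsmooth f, so the diminishing rule a_k = 1/sqrt(k+1) is used instead.
import math

def f(x):
    # Nondifferentiable convex objective (illustrative choice).
    return abs(x[0] - 1.0) + abs(x[1] + 2.0)

def subgrad(x):
    # One valid subgradient of f at x; the choice at a kink is arbitrary.
    sign = lambda t: 1.0 if t > 0 else (-1.0 if t < 0 else 0.0)
    return [sign(x[0] - 1.0), sign(x[1] + 2.0)]

def project_ball(x, r=1.0):
    # Metric projection onto the closed ball of radius r centered at 0.
    n = math.hypot(x[0], x[1])
    return list(x) if n <= r else [r * x[0] / n, r * x[1] / n]

def projected_subgradient(x0, iters=20000):
    x = project_ball(x0)
    best_x, best_f = list(x), f(x)
    for k in range(iters):
        g = subgrad(x)
        a = 1.0 / math.sqrt(k + 1)  # diminishing step-size
        x = project_ball([x[0] - a * g[0], x[1] - a * g[1]])
        if f(x) < best_f:           # track the best iterate seen so far
            best_f, best_x = f(x), list(x)
    return best_x, best_f

x_best, f_best = projected_subgradient([0.0, 0.0])
# The optimal value of this toy problem is 3 - sqrt(2), attained on the
# boundary of the ball at (1/sqrt(2), -1/sqrt(2)).
```

Because the subgradient direction is not a descent direction in general, the method tracks the best objective value seen so far; the talk's regularization techniques address the stronger question of norm convergence of the iterates themselves in infinite dimensions.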
