Invited Talk | The 4th China Operations Research Youth Forum
Convex Quadratic Programming: Restricted Wolfe Dual and
Symmetric Gauss-Seidel Decomposition Theorem
Defeng Sun, The Hong Kong Polytechnic University
Abstract: While the Lagrange dual of a linear program is standard, this is not the case for convex quadratic programming: the ambiguity is caused by the rank deficiency of the Hessian of the objective function. In this talk, we shall introduce a restricted Wolfe dual for convex quadratic programming to address this issue. This restricted Wolfe dual possesses many nice theoretical properties that resemble those of linear conic programming and allows one to design dual-based augmented Lagrangian methods with guaranteed convergence. Interestingly, our study of the restricted Wolfe dual has also led to the proof of a symmetric Gauss-Seidel decomposition theorem, which in turn has helped the design of efficient accelerated proximal gradient methods for least-squares optimization problems and of alternating direction methods of multipliers for multi-block convex optimization problems with linear constraints.
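For readers less familiar with the Wolfe dual, a standard inequality-constrained convex quadratic program and its Wolfe dual can be sketched as follows (background only; Q denotes the positive semidefinite Hessian, and the exact problem format treated in the talk may differ):

\[
\text{(P)}\qquad \min_{x}\ \tfrac{1}{2}\, x^{\top} Q x + c^{\top} x \quad \text{s.t.}\quad A x \ge b,
\]
\[
\text{(D)}\qquad \max_{x,\,\lambda}\ -\tfrac{1}{2}\, x^{\top} Q x + b^{\top} \lambda \quad \text{s.t.}\quad Q x + c - A^{\top} \lambda = 0,\ \ \lambda \ge 0.
\]

When Q is rank deficient, the variable x in (D) enters only through Qx and is therefore not uniquely determined; in our reading of the construction, the restricted Wolfe dual removes this ambiguity by confining that variable to the range space of Q.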
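The symmetric Gauss-Seidel (sGS) decomposition theorem concerns block sGS sweeps applied to convex quadratic functions. As background only, the following minimal NumPy sketch shows a classical element-wise symmetric Gauss-Seidel cycle (a forward sweep followed by a backward sweep) for a symmetric positive definite linear system; it illustrates the sweep pattern only, not the decomposition theorem or the speaker's algorithms, and the function sgs_sweep and the toy data are invented for illustration.

import numpy as np

def sgs_sweep(H, r, x):
    # One symmetric Gauss-Seidel cycle (forward sweep, then backward sweep)
    # for the linear system H x = r, with H symmetric positive definite.
    x = x.copy()
    n = len(r)
    for i in range(n):                      # forward sweep
        x[i] = (r[i] - H[i, :i] @ x[:i] - H[i, i + 1:] @ x[i + 1:]) / H[i, i]
    for i in reversed(range(n)):            # backward sweep
        x[i] = (r[i] - H[i, :i] @ x[:i] - H[i, i + 1:] @ x[i + 1:]) / H[i, i]
    return x

# Toy usage on a small symmetric positive definite system.
H = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
r = np.array([1.0, 2.0, 3.0])
x = np.zeros(3)
for _ in range(50):
    x = sgs_sweep(H, r, x)
print(np.allclose(H @ x, r))  # expected: True once the sweeps have converged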
Biodata: Professor Defeng Sun is currently Chair Professor of Applied Optimization and Operations Research at The Hong Kong Polytechnic University. He mainly publishes in convex and non-convex continuous optimization. Together with Professor Kim-Chuan Toh and Dr Liuqin Yang, he was awarded the triennial 2018 Beale–Orchard-Hays Prize for Excellence in Computational Mathematical Programming by the Mathematical Optimization Society. He served as editor-in-chief of the Asia-Pacific Journal of Operational Research from 2011 to 2013 and now serves as an associate editor of Mathematical Programming, SIAM Journal on Optimization, Journal of the Operations Research Society of China, Journal of Computational Mathematics, and Science China Mathematics. He was elected a SIAM Fellow in 2020.