A block symmetric Gauss-Seidel decomposition theorem and its applications in big data nonsmooth optimization

De-Feng Sun

Department of Applied Mathematics

The Hong Kong Polytechnic University

defeng.sun@polyu.edu.hk

The Gauss-Seidel method is a classical iterative method for solving the linear system `Ax=b`. It has long been known to be convergent when `A` is symmetric positive definite. In this talk, we shall focus on introducing a symmetric version of the Gauss-Seidel method and its elegant extensions for solving big data nonsmooth optimization problems. For a symmetric positive semidefinite linear system `Ax=b` with `x=(x_1,\ldots,x_s)` partitioned into `s` blocks, we show that each cycle of the block symmetric Gauss-Seidel (block sGS) method exactly solves the associated quadratic programming (QP) problem with an extra proximal term added. By leveraging this connection to optimization, one can extend the classical convergence result to a convex composite QP (CCQP) with an additional nonsmooth term in `x_1`, via what we call the block sGS decomposition theorem. Consequently, one is able to use the sGS method to solve a CCQP. In addition, the extended block sGS method allows for inexact computations in each step of the block sGS cycle. Moreover, one can accelerate the inexact block sGS method to achieve an iteration complexity of `O(1/k^2)` after performing `k` block sGS cycles. As a fundamental building block, the block sGS decomposition theorem has played a key role in various recently developed algorithms, such as the proximal ALM/ADMM for linearly constrained multi-block convex composite conic programming (CCCP) and the accelerated block coordinate descent method for multi-block CCCP.
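
The following is a minimal numerical sketch (not part of the talk) of the smooth case of the decomposition described above, written in Python/NumPy with arbitrarily chosen block sizes and randomly generated data. Writing `A = U^T + D + U` with `D` the block diagonal part and `U` the strictly block upper triangular part, it runs one block sGS cycle (backward sweep over blocks `s,...,2`, then forward sweep over `1,...,s`) and checks numerically that the result coincides with the minimizer of the QP augmented by the proximal term `T = U D^{-1} U^T`.

```python
# Illustrative sketch: one block sGS cycle for a symmetric positive definite
# system A x = b, compared with the solution of the proximally regularized QP
#   x^+ = argmin { 0.5 x'Ax - b'x + 0.5 ||x - x0||_T^2 },  T = U D^{-1} U',
# whose optimality condition is (A + T) x^+ = b + T x0.
# Block sizes and data below are arbitrary choices for illustration only.
import numpy as np

rng = np.random.default_rng(0)
blocks = [2, 3, 4]                      # block sizes, s = 3
n = sum(blocks)
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)             # symmetric positive definite
b = rng.standard_normal(n)
x0 = rng.standard_normal(n)

# index ranges of the blocks
idx, start = [], 0
for sz in blocks:
    idx.append(slice(start, start + sz))
    start += sz
s = len(idx)

def block_sgs_cycle(A, b, x):
    """One sGS cycle: backward sweep over blocks s..2, then forward sweep 1..s."""
    x = x.copy()
    for order in (range(s - 1, 0, -1), range(s)):
        for i in order:
            # residual of block i with the current (partially updated) x,
            # excluding the contribution of block i itself
            r = b[idx[i]] - A[idx[i], :] @ x + A[idx[i], idx[i]] @ x[idx[i]]
            x[idx[i]] = np.linalg.solve(A[idx[i], idx[i]], r)
    return x

x_sgs = block_sgs_cycle(A, b, x0)

# sGS decomposition: D (block diagonal) and U (strictly block upper triangular)
D = np.zeros_like(A)
U = np.zeros_like(A)
for i in range(s):
    D[idx[i], idx[i]] = A[idx[i], idx[i]]
    for j in range(i + 1, s):
        U[idx[i], idx[j]] = A[idx[i], idx[j]]
T = U @ np.linalg.solve(D, U.T)         # T = sGS(A) = U D^{-1} U'

# Minimizer of the proximally regularized QP
x_qp = np.linalg.solve(A + T, b + T @ x0)
print(np.max(np.abs(x_sgs - x_qp)))     # ~1e-14: the two coincide
```

In the CCQP setting described in the abstract, the forward step for the block `x_1` would instead be a proximal minimization involving the nonsmooth term, while the proximal term `T` retains the same form.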