**The Center for Advanced Mathematical Sciences invites you to a seminar entitled "Enlarged Krylov Subspace Conjugate Gradient Methods for Reducing Communication" by Sophie Moufawad, AUB.**

**Abstract:**

We are interested in solving sparse systems of linear equations Ax = b that arise mainly from the discretization of partial differential equations in numerical simulations, such as petroleum basin and reservoir simulations. These sparse systems are very large, with at least 10^6 unknowns. Hence, they are not solved using direct methods, which are prohibitive in terms of memory and flops, but rather using iterative methods that take advantage of the sparsity of the matrix, such as Krylov subspace methods. When implementing such methods on modern High Performance Computing (HPC) systems, one must take into account the fact that the cost of communication, i.e., data movement between memory and processing units, is much higher than the cost of arithmetic operations. As a result, communication is often the bottleneck in these numerical algorithms. Recent research has focused on reducing communication by reformulating Krylov subspace methods based on the so-called s-step methods, such as communication-avoiding GMRES (CA-GMRES) and communication-avoiding CG (CA-CG) [3, 4]. However, there are very few good preconditioners, such as CA-ILU0 [1], that can be used with these communication-avoiding methods. This is a serious limitation, since for ill-conditioned systems, unpreconditioned Krylov subspace methods can converge very slowly or not at all. We present a new approach for reducing communication in Krylov subspace methods [2], which consists of enlarging the Krylov subspace by up to t vectors per iteration, based on a domain decomposition of the graph of A. The resulting enlarged Krylov subspace is a superset of the classical Krylov subspace; we therefore search for the solution of Ax = b in this enlarged subspace. Unlike communication-avoiding methods, enlarged Krylov subspace methods can be combined with any preconditioner.
Thus, we introduce several enlarged conjugate gradient versions, such as MSDO-CG and SRE-CG. These versions converge faster than CG in terms of iterations, since the solution is sought in the enlarged subspace, which contains the classical Krylov subspace. In addition, the enlarged CG versions are parallelizable and reduce communication.
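As a rough illustration of the enlarged-subspace idea (a sketch, not the MSDO-CG or SRE-CG algorithms of [2]), the snippet below splits the initial residual into t vectors along a block partition of the unknowns (a stand-in for a domain decomposition of the graph of A), builds the enlarged Krylov basis, and uses a Galerkin projection to compare against the classical Krylov subspace of the same depth. All function names, the toy 1D Laplacian, and the uniform index partition are illustrative assumptions.

```python
import numpy as np

def split_residual(r, t):
    """Split r into t vectors supported on disjoint index blocks
    (a stand-in for a domain decomposition of the graph of A)."""
    n = r.size
    parts = np.zeros((n, t))
    bounds = np.linspace(0, n, t + 1, dtype=int)
    for i in range(t):
        parts[bounds[i]:bounds[i + 1], i] = r[bounds[i]:bounds[i + 1]]
    return parts

def enlarged_krylov_basis(A, r0, t, k):
    """Orthonormal basis of span{T(r0), A T(r0), ..., A^{k-1} T(r0)},
    where T splits r0 into t subdomain vectors; t = 1 recovers the
    classical Krylov subspace of dimension k."""
    V = split_residual(r0, t)
    blocks = [V]
    for _ in range(k - 1):
        V = A @ V
        blocks.append(V)
    Q, _ = np.linalg.qr(np.hstack(blocks))
    return Q

def galerkin_solve(A, b, Q):
    """Galerkin projection: find x in range(Q) with Q^T (b - A x) = 0."""
    y = np.linalg.solve(Q.T @ A @ Q, Q.T @ b)
    return Q @ y

# Small SPD model problem: 1D Laplacian, as from a discretized PDE.
n = 64
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x_star = np.linalg.solve(A, b)

def err_A(x):  # A-norm of the error, the quantity CG minimizes
    e = x_star - x
    return np.sqrt(e @ A @ e)

k, t = 5, 4
x_cg  = galerkin_solve(A, b, enlarged_krylov_basis(A, b, 1, k))  # classical
x_enl = galerkin_solve(A, b, enlarged_krylov_basis(A, b, t, k))  # enlarged
print(err_A(x_enl) <= err_A(x_cg))  # superset => error no larger
```

Because the t split vectors sum to r0, every classical Krylov vector A^j r0 lies in the enlarged subspace, so the A-norm error after k steps cannot increase; the actual methods in [2] obtain this effect with short recurrences and block operations that are parallelizable and communication-efficient.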

**References**

[1] L. Grigori and S. Moufawad. Communication Avoiding ILU0 Preconditioner. SIAM Journal on Scientific Computing, 37:2, pp. C217-C246, 2015.

[2] L. Grigori, S. Moufawad, and F. Nataf. Enlarged Krylov Subspace Conjugate Gradient Methods for Reducing Communication. SIAM Journal on Matrix Analysis and Applications, 37:2, pp. 744-773, 2016.

[3] M. Hoemmen. Communication-Avoiding Krylov Subspace Methods. PhD thesis, EECS Department, University of California, Berkeley, 2010.

[4] M. Mohiyuddin, M. Hoemmen, J. Demmel, and K. Yelick. Minimizing communication in sparse matrix solvers. In Proceedings of the Conference on High Performance Computing Networking, Storage and Analysis (SC '09), ACM, New York, NY, USA, 2009, pp. 1-12.