Event

Communication-avoiding Algorithms for Linear Algebra, Machine Learning and Beyond

Dr. James Demmel

Algorithms have two costs: arithmetic and communication, i.e. moving data between levels of a memory hierarchy or between processors over a network. Communication costs (measured in time or energy per operation) already greatly exceed arithmetic costs, and the gap is growing over time following technological trends. Thus our goal is to design algorithms that minimize communication. We present new algorithms that communicate asymptotically less than their classical counterparts, for a variety of linear algebra and machine learning problems, demonstrating large speedups on a variety of architectures. Some of these algorithms attain provable lower bounds on communication. We describe a generalization of these lower bounds and optimal algorithms to arbitrary code that can be expressed as nested loops accessing arrays, assuming only that the array subscripts are affine functions of the loop indices; a special case is convolutional neural nets.
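To make the idea concrete, here is a minimal illustrative sketch (not from the talk itself) of the classic communication-avoiding kernel: blocked matrix multiplication. Computing C = A·B in b-by-b tiles lets each tile loaded into fast memory be reused O(b) times, reducing data movement between slow and fast memory from O(n³) to O(n³/√M) words for a fast memory of size M; the tile size b is a hypothetical parameter, ideally chosen so that three b-by-b tiles fit in fast memory.

```python
def blocked_matmul(A, B, n, b):
    """Multiply two n x n matrices (lists of lists) in b x b blocks.

    The blocking does not change the arithmetic count (still ~2n^3 flops);
    it only changes the order of the loops so that each tile of A, B, and C
    stays in fast memory while it is reused, reducing communication.
    """
    C = [[0.0] * n for _ in range(n)]
    for ii in range(0, n, b):          # tile row of C
        for jj in range(0, n, b):      # tile column of C
            for kk in range(0, n, b):  # tile of the inner dimension
                # Inner loops touch only three b x b tiles,
                # reusing each loaded entry about b times.
                for i in range(ii, min(ii + b, n)):
                    for k in range(kk, min(kk + b, n)):
                        a_ik = A[i][k]
                        for j in range(jj, min(jj + b, n)):
                            C[i][j] += a_ik * B[k][j]
    return C
```

Note that all six loop subscripts here (`i`, `j`, `k` and the tile offsets) are affine functions of the loop indices, which is exactly the class of programs the generalized lower bounds in the abstract apply to.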

Bio: James Demmel is the Dr. Richard Carl Dehmel Distinguished Professor of Computer Science and Mathematics at the University of California, Berkeley, and Chair of the EECS Department. His research is in numerical linear algebra, high-performance computing, and communication-avoiding algorithms. He is known for his work on the LAPACK and ScaLAPACK linear algebra libraries. He is a member of the NAS, NAE, and American Academy of Arts and Sciences; a Fellow of the AAAS, ACM, AMS, IEEE, and SIAM; and a winner of the IPDPS Charles Babbage Award, the IEEE Computer Society Sidney Fernbach Award, the ACM Paris Kanellakis Award, and numerous best paper prizes.

Last Updated: May 28, 2020 - 4:06 pm