Gaussian Processes, Dimension Reduction, Approximation, and More

Dr. Josiah Park

Abstract: In this talk, we explore the interplay among Gaussian Processes (GPs), approximation theory, and dimension reduction. The goal is to present an introduction to GPs, a popular choice in machine learning for regression problems. We will establish a basic understanding of GPs and their properties, and highlight some of the computational challenges of applying GPs to big data. We will then turn to connections between GPs and approximation theory, showing examples of how the choice of kernel function in Gaussian process regression (GPR) affects function estimation. Insights from approximation theory can help us understand and analyze the behavior of GPR. Finally, we turn to dimension reduction, discussing examples of intrinsically low-dimensional or ‘sparse’ data (such as natural images). We compare kernel PCA, t-distributed Stochastic Neighbor Embedding (t-SNE), and Uniform Manifold Approximation and Projection (UMAP), and discuss their connections with GPR. The talk is designed to provide an understanding of these mathematical tools and how they apply to a broad range of disciplines, including machine learning, data science, and computational imaging.

Speaker’s Bio: Josiah Park, Ph.D., is a Postdoctoral Researcher at Lawrence Berkeley National Laboratory. He was previously a National Science Foundation TRIPODS postdoctoral fellow at Texas A&M University (2020-2022), supervised by Professors Simon Foucart and Ronald DeVore. Prior to this, he received his doctorate in mathematics from the Georgia Institute of Technology in 2020 under the supervision of Professors Christopher Heil and Michael Lacey.