High-Dimensional Optimization with a Novel Nonlocal Gradient

Dr. Hoang Tran

Abstract: The problem of minimizing multi-modal loss functions with a large number of local optima arises frequently in machine learning and model calibration. Because the local gradient points in the direction of steepest slope within an infinitesimal neighborhood, an optimizer guided by the local gradient is often trapped in a local minimum. To address this issue, we develop a novel nonlocal gradient that skips small local minima by capturing the major structure of the loss landscape in black-box optimization. The nonlocal gradient is defined via a directional Gaussian smoothing (DGS) approach. The key idea of DGS is to conduct 1D long-range exploration with a large smoothing radius along d orthogonal directions in R^d, each of which defines a nonlocal directional derivative as a 1D integral. This long-range exploration is what enables the nonlocal gradient to skip small local minima. The d directional derivatives are then assembled to form the nonlocal gradient, and each of the d 1D integrals is approximated with the Gauss-Hermite quadrature rule to obtain an accurate estimator. We provide a convergence theory for the scenario in which the objective function is a convex function perturbed by a highly oscillating, deterministic noise. We prove that our method converges exponentially to a tightened neighborhood of the solution, whose size is characterized by the noise wavelength. The performance of our method is demonstrated and evaluated on several high-dimensional benchmark tests as well as machine learning and model calibration problems.
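To make the construction concrete, the following is a minimal sketch of a DGS-style estimator, not the authors' implementation: for each direction xi_i, the derivative of the Gaussian-smoothed 1D slice t -> E[f(x + (t + u) xi_i)], u ~ N(0, sigma^2), is evaluated at t = 0 by Gauss-Hermite quadrature, and the d directional derivatives are assembled into a gradient. The function name, default parameters, and the choice of the standard basis as the d orthogonal directions are illustrative assumptions.

```python
import numpy as np

def dgs_gradient(f, x, sigma=1.0, n_quad=7, directions=None):
    """Sketch of a DGS-style nonlocal gradient estimator (illustrative, not
    the authors' code).

    For each direction xi_i, approximates the derivative at t = 0 of the
    Gaussian-smoothed slice t -> E[f(x + (t + u) xi_i)], u ~ N(0, sigma^2),
    via Gauss-Hermite quadrature, then assembles the d directional
    derivatives into a gradient vector.
    """
    d = len(x)
    if directions is None:
        directions = np.eye(d)  # d orthogonal search directions (assumed: standard basis)
    # Gauss-Hermite nodes/weights for the weight function exp(-v^2)
    nodes, weights = np.polynomial.hermite.hermgauss(n_quad)
    dgs = np.zeros(d)
    for i in range(d):
        xi = directions[i]
        # Change of variables u = sqrt(2)*sigma*v matches the Gaussian
        # expectation to the Gauss-Hermite weight exp(-v^2).
        vals = np.array([f(x + np.sqrt(2.0) * sigma * v * xi) for v in nodes])
        # d/dt E[f(x + (t+u) xi)] at t=0 equals E[u f(x + u xi)] / sigma^2.
        dgs[i] = np.sum(weights * nodes * vals) * np.sqrt(2.0) / (sigma * np.sqrt(np.pi))
    # Rotate the directional derivatives back to the standard basis.
    return directions.T @ dgs
```

With a large smoothing radius sigma, each 1D integral averages over a long segment of the landscape, so the estimator responds to the landscape's large-scale slope rather than to small local wiggles; as sigma -> 0 it recovers the local gradient (e.g., for f(z) = z.z it returns 2x exactly for any sigma, since the smoothing of a quadratic only shifts it by a constant).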

Speaker’s Bio: Hoang Tran is currently a Research Scientist in the Data Analysis and Machine Learning Group of the Computer Science and Mathematics Division at Oak Ridge National Laboratory. His research interests include compressed sensing, optimization for machine learning, high-dimensional approximation, numerical solution of partial differential equations, and computational fluid dynamics. He received his Ph.D. in Applied Mathematics from the University of Pittsburgh in 2013, under the supervision of Catalin Trenchea and William Layton.

Last Updated: November 28, 2023 - 10:33 am