Project

Black-box training for scientific machine learning models

Project Status: Active

Project Summary: We propose to develop a scalable black-box training framework for scientific machine learning (SciML) models that cannot be trained with existing automatic-differentiation-based algorithms. Our particular interest in this effort is how to train data-driven SciML models to learn the missing physics of a complex system and thereby advance forward simulations. Specifically, this effort aims to achieve the following objectives: (1) develop a novel non-local gradient with structured sampling that enables non-local exploration for escaping local minima while achieving sufficient accuracy in gradient estimation; (2) advance theoretical analyses for a class of non-convex training problems to help domain scientists tune the hyper-parameters of the proposed training framework; and (3) exploit high-performance computing to accelerate the time to solution for black-box training problems whose loss functions involve computationally expensive black-box simulators. The proposed framework will be demonstrated on two distinct applications: training machine learning-based constitutive models to predict mercury dynamics in the Spallation Neutron Source mercury target, and training heat-source models to predict time-dependent laser scan paths that yield desirable microstructures in three-dimensionally printed metal components. This effort will also advance the state of the art in several machine learning areas, such as reinforcement learning and variational inference.
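The centerpiece of objective (1) is the directional Gaussian smoothing (DGS) gradient developed in the publications listed below. The following is a minimal NumPy sketch of that idea, written from the construction described in the arXiv preprints: the loss is smoothed by a one-dimensional Gaussian along each direction of a random orthonormal basis, each smoothed directional derivative is approximated with Gauss-Hermite quadrature, and the results are assembled into a full non-local gradient. The function name dgs_gradient and the parameters sigma (smoothing radius) and n_quad (quadrature points per direction) are illustrative choices, not the project's actual API.

```python
import numpy as np

def dgs_gradient(f, x, sigma=0.1, n_quad=5, rng=None):
    """Directional-Gaussian-smoothing (DGS) gradient estimate of a
    black-box loss f at the point x (illustrative sketch).

    Along each direction of a random orthonormal basis, f is smoothed
    by a 1-D Gaussian of radius sigma, and the smoothed directional
    derivative is approximated by n_quad-point Gauss-Hermite quadrature.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    # Random orthonormal basis (QR of a Gaussian matrix); the columns
    # are the smoothing directions xi_1, ..., xi_d.
    basis, _ = np.linalg.qr(rng.standard_normal((d, d)))
    # Gauss-Hermite nodes t_m and weights w_m for the weight exp(-t^2).
    nodes, weights = np.polynomial.hermite.hermgauss(n_quad)
    dir_derivs = np.empty(d)
    for i in range(d):
        xi = basis[:, i]
        # Evaluate f at the quadrature points x + sqrt(2)*sigma*t_m*xi.
        # These n_quad * d evaluations are mutually independent, so a
        # black-box simulator can be called for all of them in parallel.
        vals = np.array([f(x + np.sqrt(2.0) * sigma * t * xi) for t in nodes])
        # Quadrature estimate of d/ds E_{v~N(0,1)}[f(x + (s + sigma*v)*xi)]
        # at s = 0, i.e., the Gaussian-smoothed directional derivative.
        dir_derivs[i] = np.sum(weights * np.sqrt(2.0) * nodes * vals) \
            / (sigma * np.sqrt(np.pi))
    # Assemble the directional derivatives back into a full gradient.
    return basis @ dir_derivs

# Example: one descent step on a rugged test loss.
f = lambda z: np.sum(z**2) + 0.1 * np.sum(np.cos(20.0 * z))
x = np.ones(10)
x = x - 0.1 * dgs_gradient(f, x, sigma=0.5)
```

Because the quadrature samples f at points a distance on the order of sigma away from x, rather than infinitesimally close, the estimate averages over a neighborhood and can step across small local minima; sigma controls the trade-off between non-local exploration and fidelity to the local gradient. The per-step function evaluations are independent, which is the structure objective (3) exploits on high-performance computing resources.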

Principal Investigator: Guannan Zhang (CSMD, ORNL)

Senior Investigators: Jiaxin Zhang (CSMD, ORNL), Hoang Tran (CSMD, ORNL), Dan Lu (CSED, ORNL), Matthew Bement (CSED, ORNL), Yousub Lee (CSED, ORNL), Benjamin Stump (MSTD, ORNL), Sirui Bi (CSED, ORNL)

Funding Period: Sept. 2020 to Aug. 2022

In December 2020, Jiaxin Zhang presented our directional Gaussian smoothing (DGS) gradient optimization method at the NeurIPS 2020 Workshop on Machine Learning for Engineering Modeling, Simulation and Design.

Publications: 

  • Jiaxin Zhang, Hoang Tran, Dan Lu, and Guannan Zhang, A novel evolution strategy with directional Gaussian smoothing for blackbox optimization, submitted. (https://arxiv.org/abs/2002.03001)
  • Jiaxin Zhang, Hoang Tran, and Guannan Zhang, Accelerating reinforcement learning with a directional-Gaussian-smoothing evolution strategy, submitted. (https://arxiv.org/abs/2002.09077)
  • Hoang Tran and Guannan Zhang, AdaDGS: An adaptive black-box optimization method with a nonlocal directional Gaussian smoothing gradient, submitted. (https://arxiv.org/abs/2011.02009)
  • Hoang Tran, Dan Lu, and Guannan Zhang, Boosting black-box adversarial attack via exploiting loss smoothness, Proceedings of ICLR Workshop on Security and Safety in Machine Learning Systems, 2021.
  • Jiaxin Zhang, Sirui Bi, and Guannan Zhang, A directional Gaussian smoothing optimization method for computational inverse design in nanophotonics, Materials & Design, 197, 109213, 2021.
  • Jiaxin Zhang, Sirui Bi, and Guannan Zhang, A nonlocal-gradient descent method for inverse design in nanophotonics, Proceedings of NeurIPS Workshop on Machine Learning for Engineering Modeling, Simulation and Design, Dec. 2020.

Activities:

  • In December 2020, Sirui Bi gave a presentation on "Directional Gaussian Smoothing Optimization for Inverse Design in Nanophotonics" at the Conference on Machine Learning in Science and Engineering (MLSE 2020).
  • In December 2020, Jiaxin Zhang gave a presentation on "A nonlocal-gradient descent method for inverse design in nanophotonics" at the NeurIPS 2020 Workshop on Machine Learning for Engineering Modeling, Simulation and Design.

Last Updated: April 6, 2021
