Abstract: We propose a neural network-based approach to the homogenization of multiscale problems. The method trains a network with a derivative-free loss that incorporates Brownian walkers to find the macroscopic description of a multiscale partial differential equation (PDE) solution. Compared with other network-based approaches to multiscale problems, the proposed method requires neither a hand-crafted neural network architecture nor solving a cell problem to compute the homogenized coefficients. The exploration neighborhood of the Brownian walkers affects the overall learning trajectory. We determine bounds on the micro- and macro-time steps that allow the network to capture the locally heterogeneous and globally homogeneous solution behaviors, respectively. These bounds imply that, for standard periodic problems, the computational cost of the proposed method is independent of the microscale periodic structure. We validate the efficiency and robustness of the proposed method on a suite of linear and nonlinear multiscale problems with periodic and random-field coefficients.
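To give a flavor of the derivative-free formulation mentioned in the abstract, the sketch below illustrates the underlying Feynman-Kac idea on a simple Poisson problem -u'' = f: the value of the solution at a point is expressed as an expectation over short Brownian walker steps rather than through derivatives, so a training target can be built by Monte Carlo sampling alone. This is a minimal illustration with hypothetical names, not the speaker's actual method or code.

```python
import numpy as np

def derivative_free_target(u, x, f, dt, n_walkers=1000, rng=None):
    """Monte Carlo estimate of E[u(x + sqrt(2*dt)*Z)] + dt * f(x), Z ~ N(0, 1).

    For -u'' = f, a Taylor expansion gives E[u(x + sqrt(2*dt)*Z)] =
    u(x) + dt * u''(x) + O(dt^2) = u(x) - dt * f(x) + O(dt^2), so this
    quantity approximates u(x) without evaluating any derivative of u.
    In training, `u` would be the neural network and the mismatch between
    u(x) and this target would serve as the loss.
    """
    rng = np.random.default_rng() if rng is None else rng
    steps = np.sqrt(2.0 * dt) * rng.standard_normal((n_walkers,) + np.shape(x))
    return np.mean(u(x + steps), axis=0) + dt * f(x)

# Sanity check on an exact solution: u(x) = sin(pi x) solves -u'' = pi^2 sin(pi x).
u_exact = lambda x: np.sin(np.pi * x)
f = lambda x: np.pi**2 * np.sin(np.pi * x)
target = derivative_free_target(u_exact, 0.3, f, dt=1e-4, n_walkers=200_000,
                                rng=np.random.default_rng(0))
# target is close to u_exact(0.3); the gap shrinks with dt and n_walkers
```

In the multiscale setting the abstract describes, the step size of the walkers plays the role of the micro/macro time steps whose bounds the talk analyzes.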
Speaker’s Bio: Jihun Han received his Ph.D. in Applied Mathematics from the Courant Institute, NYU, in 2017. After completing his Ph.D., he worked as a data scientist at Samsung Fire and Marine Insurance Co., Ltd., mainly developing deep learning algorithms for industrial applications. He conducted postdoctoral research at the University of Toronto in 2019-2020, and since joining Dartmouth College in 2020 he has continued his work in scientific machine learning. His current research extends to neural network approaches for solving PDEs, inverse problems with limited data, and learning complex dynamical systems.
Last Updated: May 16, 2023 - 3:19 pm