Abstract: Physics-informed machine learning (PIML) aims to incorporate physics knowledge into deep neural networks (DNNs) to improve model generalization and interpretability. However, existing PIML methods are either designed for specific problems or rely on black-box DNNs whose results are hard to interpret. Can we develop a generic, interpretable neural architecture that can be used in a wide range of domains? In this talk, I will present the Taylor Neural Network (TaylorNet), a novel generic neural architecture that parameterizes Taylor polynomials using DNNs without non-linear activation functions. I will describe how we overcome the curse of dimensionality and improve the stability of model training in this architecture. I will show that TaylorNet can be applied to a variety of application domains, including dynamical systems, image classification, and natural language processing. First, it achieves performance on par with the state of the art in image classification and natural language processing while remarkably reducing model parameters. More importantly, I will show that TaylorNet can explicitly learn and interpret some classic dynamical systems with Taylor polynomials, paving the way for interpretable machine learning for scientific discovery. I will conclude by discussing ongoing work and future directions for TaylorNet.
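To make the core idea concrete, here is a minimal sketch of a single polynomial layer that computes a learnable second-order Taylor expansion with no nonlinear activation. The class name, structure, and initialization are hypothetical illustrations of the general technique, not the TaylorNet authors' implementation; note that the naive second-order term already uses O(d²) coefficients per output, which hints at the curse of dimensionality the talk addresses.

```python
import numpy as np

class TaylorLayer:
    """Toy second-order Taylor layer: y = b + W1 @ x + x^T W2 x (per output).

    Hypothetical sketch of parameterizing a Taylor polynomial with learnable
    coefficients and no nonlinear activation; not the authors' actual code.
    """

    def __init__(self, in_dim, out_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.b = np.zeros(out_dim)                               # zeroth-order term
        self.W1 = rng.normal(0.0, 0.1, (out_dim, in_dim))        # first-order coefficients
        self.W2 = rng.normal(0.0, 0.1, (out_dim, in_dim, in_dim))  # second-order coefficients

    def __call__(self, x):
        first = self.W1 @ x
        # Quadratic term: for each output o, sum_ij W2[o, i, j] * x[i] * x[j]
        second = np.einsum('oij,i,j->o', self.W2, x, x)
        return self.b + first + second

layer = TaylorLayer(in_dim=3, out_dim=2)
y = layer(np.array([1.0, -0.5, 2.0]))  # shape (2,)
```

Because every term is a plain polynomial in the input, the learned coefficients can be read off directly, which is what makes this family of models interpretable for dynamical systems.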
Speaker’s Bio: Dr. Huajie Shao is a tenure-track assistant professor of Computer Science at the College of William and Mary. He obtained his Ph.D. in Computer Science from the University of Illinois at Urbana-Champaign in 2021.
Dr. Shao's research interests focus on physics-guided machine learning, machine learning for scientific discovery, and interpretable machine learning. Thus far, he has published more than 40 papers in top-tier conferences and journals, such as the International Conference on Machine Learning (ICML), the Conference on Computer Vision and Pattern Recognition (CVPR), the Annual Meeting of the Association for Computational Linguistics (ACL), IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), the Conference on Very Large Data Bases (VLDB), WWW, the ACM SIGIR Conference on Research and Development in Information Retrieval, and the IEEE International Conference on Computer Communications (INFOCOM). He also received the SenSys’20 and ICCPS’17 Best Paper Awards, the FUSION’19 Student Paper Award, and the UbiComp’19 Distinguished Paper Award.
Last Updated: January 17, 2023 - 8:12 am