
Globally Injective Deep Neural Networks

Dr. Michael Puthawala

Abstract: We present an analysis of injective ReLU deep neural networks. We establish sharp conditions for injectivity of ReLU layers and networks, both fully connected and convolutional. We show, through a layer-wise analysis, that an expansivity factor of two is necessary for injectivity. We also show sufficiency by constructing weight matrices that guarantee injectivity. Further, we show that global injectivity with i.i.d. Gaussian matrices, a commonly used tractable model, requires considerably larger expansivity. We then derive the inverse Lipschitz constant and study the approximation-theoretic properties of injective neural networks. Using arguments from differential topology, we prove that, under mild technical conditions, any Lipschitz map can be approximated by an injective neural network. This justifies the use of injective neural networks in problems that a priori do not require injectivity.
Joint work with M. de Hoop, K. Kothari, M. Lassas, and I. Dokmanić.
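
As an informal illustration of the expansivity-two sufficiency claim in the abstract, the following minimal NumPy sketch uses one well-known construction: stacking a full-rank matrix B with its negation, W = [B; -B], so that ReLU(Wx) determines Bx exactly via the identity ReLU(Bx) - ReLU(-Bx) = Bx. This is a sketch under our own assumptions, not code from the talk, and the sharp conditions discussed in the abstract are more general.

# Minimal sketch: an injective ReLU layer via the stacked construction W = [B; -B].
# This is an assumed illustrative construction, not the speaker's code.
import numpy as np

rng = np.random.default_rng(0)
n = 8                               # input dimension
B = rng.standard_normal((n, n))    # full rank with probability 1, hence injective

W = np.vstack([B, -B])             # expansivity factor two: output dimension is 2n

def relu_layer(x):
    # The forward map x -> ReLU(Wx).
    return np.maximum(W @ x, 0.0)

def invert_relu_layer(y):
    # ReLU(Bx) - ReLU(-Bx) = Bx, so Bx is recovered exactly from the two halves,
    # and injectivity of B lets us solve for x.
    Bx = y[:n] - y[n:]
    return np.linalg.solve(B, Bx)

x = rng.standard_normal(n)
x_rec = invert_relu_layer(relu_layer(x))
print(np.allclose(x, x_rec))       # True: the layer is globally injective

Note the doubling of dimension: the output has 2n coordinates for an n-dimensional input, consistent with the necessary expansivity factor of two stated in the abstract.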

Bio: Michael Puthawala graduated from Rensselaer Polytechnic Institute (RPI) in 2014 with a BS in math, and then from UCLA in 2019 with a PhD in Applied Math under Dr. Stanley Osher. During his education he completed five separate internships at MIT Lincoln Lab, Oak Ridge National Lab, and Google. He is now a Simons Postdoctoral Fellow at Rice University under Maarten V. de Hoop.
