### Achievement

We developed a Nonlinear Level-set Learning (NLL) method for dimensionality reduction in high-dimensional function approximation with small data. The main contributions of this effort can be summarized as follows: (a) development of a RevNet-based coordinate transformation model that captures the geometry of level sets, extending function dimensionality reduction to the nonlinear regime; (b) design of a new loss function that exploits gradients of the target function to successfully train the proposed RevNet-based nonlinear transformation; (c) demonstration of the performance of the proposed NLL method on a high-dimensional, real-world composite material design problem for rocket inter-stage manufacture.
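The key property behind contribution (a) is that a RevNet block is exactly invertible by construction, so the learned coordinate transformation can be undone analytically. The sketch below shows a single additive-coupling block; the sub-network sizes and `tanh` activation are illustrative assumptions, not the exact architecture used in this work.

```python
import numpy as np

rng = np.random.default_rng(0)

# One additive-coupling RevNet block: split x into halves (u, v), then
#   y1 = u + s(v),   y2 = v + t(y1).
# The block is invertible for ANY sub-networks s and t, which is what
# allows the learned nonlinear coordinate change to be inverted exactly.
d = 4                        # input dimension (even, so the split works)
W1 = rng.normal(size=(d // 2, d // 2))
W2 = rng.normal(size=(d // 2, d // 2))

def s(v):
    return np.tanh(v @ W1)   # illustrative sub-network (hypothetical weights)

def t(y1):
    return np.tanh(y1 @ W2)  # illustrative sub-network (hypothetical weights)

def forward(x):
    u, v = x[: d // 2], x[d // 2:]
    y1 = u + s(v)
    y2 = v + t(y1)
    return np.concatenate([y1, y2])

def inverse(y):
    y1, y2 = y[: d // 2], y[d // 2:]
    v = y2 - t(y1)           # undo the second coupling
    u = y1 - s(v)            # undo the first coupling
    return np.concatenate([u, v])

x = rng.normal(size=d)
z = forward(x)               # transformed coordinates
x_rec = inverse(z)           # exact reconstruction of the input
```

In the full method, several such blocks are stacked and their sub-networks are trained so that the transformed coordinates align with the level sets of the target function.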

### Significance and Impact

The new approach can be used for a wide range of applications (even with experimental data). For example, it can accelerate the design of multi-layer composite shells, which are used in pressure vessels, reservoirs and tanks, and rocket and spacecraft parts, by determining optimal ply angles.

### Overview

We developed a Nonlinear Level-set Learning (NLL) method for dimensionality reduction in high-dimensional function approximation with small data. This work is motivated by a variety of design tasks in real-world engineering applications, where practitioners seek to replace computationally intensive physical models (e.g., high-resolution fluid simulators) with fast-to-evaluate predictive machine learning models in order to accelerate the engineering design process. There are two major challenges in constructing such predictive models: (a) high-dimensional inputs (e.g., many independent design parameters) and (b) small training data, generated by running extremely time-consuming simulations. Reducing the input dimension is therefore critical to alleviate the over-fitting caused by data insufficiency.

Existing methods, including sliced inverse regression and active subspace approaches, reduce the input dimension by learning a linear coordinate transformation; our main contribution is to extend this transformation approach to the nonlinear regime. Specifically, we exploit reversible networks (RevNets) to learn the nonlinear level sets of a high-dimensional function and parameterize those level sets in a low-dimensional space. A new loss function was designed to utilize samples of the target function's gradient, encouraging the transformed function to be sensitive to only a few of the transformed coordinates. The NLL approach is demonstrated on three 2D functions and two 20D functions, showing the improved approximation accuracy gained from the nonlinear transformation, as well as on an 8D composite material design problem for optimizing the buckling-resistance performance of the composite shells of rocket inter-stages.
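The gradient-informed loss can be sketched as follows: if z = g(x) is the learned transform with Jacobian J, the chain rule gives grad_z f = J^{-T} grad_x f, and one plausible sensitivity penalty is the mean squared gradient in the inactive coordinates. The linear transform and quadratic target below are toy stand-ins chosen so the Jacobian and gradient are known in closed form; they are not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(1)
d, k = 4, 1                  # input dimension, number of active coordinates

# Toy invertible transform z = g(x) = A x (NLL would use a RevNet here);
# for a linear map the Jacobian is simply A itself.
A = rng.normal(size=(d, d)) + d * np.eye(d)   # kept well-conditioned
J = A

def grad_f(x):
    # Gradient of a toy target f(x) = (w . x)^2, known in closed form.
    # In practice these gradient samples come from the simulator/model.
    w = np.ones(d)
    return 2 * (w @ x) * w

# Chain rule: f(x) = h(g(x))  =>  grad_x f = J^T grad_z h,
# so the gradient in the transformed coordinates is J^{-T} grad_x f.
X = rng.normal(size=(8, d))                   # small sample set
G_x = np.stack([grad_f(x) for x in X])        # (8, d) gradient samples
G_z = np.linalg.solve(J.T, G_x.T).T           # gradients in z-coordinates

# Sensitivity loss: penalize gradient energy in the inactive coordinates,
# pushing the transformed function to depend on only the first k of them.
loss = np.mean(G_z[:, k:] ** 2)
```

Minimizing such a loss over the transform's parameters is what drives the transformed function toward effective low dimensionality.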