Abstract: In this talk, we consider nonlinear optimization problems that involve surrogate models represented by neural networks. We demonstrate how to directly embed neural network evaluation into optimization models, and we highlight a difficulty with this approach that can prevent convergence. We then present two alternative formulations of these problems: as mixed-integer optimization problems, and as optimization problems with complementarity constraints. Each of these formulations can be solved with state-of-the-art optimization methods, and we show how to obtain good initial feasible solutions for these methods. We compare our formulations on three practical applications arising in the design and control of combustion engines, in the generation of adversarial attacks on classifier networks, and in the determination of optimal flows in an oil well network.
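The mixed-integer reformulation mentioned in the abstract can be illustrated on a single ReLU neuron. The sketch below (the weights, the big-M constant, and the function name are illustrative assumptions, not taken from the talk) checks the standard big-M encoding of y = max(0, w·x + b), in which a binary variable z selects whether the neuron is active:

```python
# Hedged sketch: big-M mixed-integer encoding of one ReLU neuron,
# y = max(0, w*x + b). All numeric values here are illustrative.

def relu_bigM_feasible(x, y, z, w=2.0, b=-1.0, M=100.0, tol=1e-9):
    """Check the standard big-M constraints for y = max(0, w*x + b):
         y >= w*x + b                  (lower-bounds y by the pre-activation)
         y <= w*x + b + M*(1 - z)      (tight when z = 1, i.e. neuron active)
         0 <= y <= M*z                 (forces y = 0 when z = 0)
         z in {0, 1}
       For |w*x + b| < M, the unique feasible y equals the ReLU output.
    """
    a = w * x + b  # pre-activation value
    return (z in (0, 1)
            and y >= a - tol
            and y <= a + M * (1 - z) + tol
            and -tol <= y <= M * z + tol)

# Input x = 1.5 gives pre-activation a = 2.0 > 0, so only the active
# pattern (z = 1, y = 2.0) satisfies all constraints.
print(relu_bigM_feasible(1.5, 2.0, 1))  # prints True  (active, y = ReLU)
print(relu_bigM_feasible(1.5, 0.0, 0))  # prints False (inactive pattern cut off)
print(relu_bigM_feasible(0.0, 0.0, 0))  # prints True  (a = -1.0, neuron off)
```

Stacking one such block per neuron yields the mixed-integer model of the whole network; the complementarity formulation instead replaces the binary z with the condition 0 ≤ y ⟂ y − (w·x + b) ≥ 0.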
Speaker’s Bio: Sven Leyffer joined the Mathematics and Computer Science division at Argonne in 2002, where he is now a senior computational mathematician. Sven is a Fellow of the Society for Industrial and Applied Mathematics and a senior fellow of the University of Chicago/Argonne Computation Institute. He is the current Secretary of the International Council for Industrial and Applied Mathematics and serves on the editorial boards of Computational Optimization and Applications and Mathematics of Computation. He is also editor-in-chief of Mathematical Programming Series B. Leyffer obtained his Ph.D. in 1994 from the University of Dundee, Scotland, and held postdoctoral research positions at Dundee, Argonne, and Northwestern University.
Research Interests:
- Development of reliable methods for solving large-scale nonlinear optimization problems
- Implementation and analysis of filter-type algorithms
- Extending nonlinear optimization methodologies to emerging areas such as mixed-integer nonlinear optimization and optimization problems with complementarity constraints
Last Updated: October 8, 2021 - 10:51 am