Publication

Optimizing deep learning hyper-parameters through an evolutionary algorithm

Citation

Optimizing deep learning hyper-parameters through an evolutionary algorithm. MLHPC '15: Proceedings of the Workshop on Machine Learning in High-Performance Computing Environments, 2015.

Abstract

There has been a recent surge of success in utilizing Deep Learning (DL) in imaging and speech applications for its relatively automatic feature generation and, in particular for convolutional neural networks (CNNs), high-accuracy classification abilities. While these models learn their parameters through data-driven methods, model selection (as architecture construction) through hyper-parameter choices remains a tedious and highly intuition-driven task. To address this, Multi-node Evolutionary Neural Networks for Deep Learning (MENNDL) is proposed as a method for automating network selection on computational clusters through hyper-parameter optimization performed via genetic algorithms.
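To illustrate the general idea behind genetic-algorithm-based hyper-parameter search (this is a minimal sketch, not MENNDL's actual implementation; the search space, genetic-algorithm settings, and placeholder fitness function below are assumptions for illustration only):

import random

# Hypothetical hyper-parameter search space; ranges are illustrative, not MENNDL's.
SEARCH_SPACE = {
    "num_filters":   [16, 32, 64, 128],
    "kernel_size":   [3, 5, 7],
    "learning_rate": [1e-1, 1e-2, 1e-3, 1e-4],
}

def random_individual():
    """Sample one hyper-parameter configuration at random."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def evaluate(individual):
    """Placeholder fitness: in practice this would train a CNN with these
    hyper-parameters (e.g. on one cluster node) and return validation accuracy."""
    return random.random()  # stand-in for a real validation score

def crossover(parent_a, parent_b):
    """Uniform crossover: each hyper-parameter comes from one parent at random."""
    return {k: random.choice([parent_a[k], parent_b[k]]) for k in SEARCH_SPACE}

def mutate(individual, rate=0.2):
    """With probability `rate`, resample each hyper-parameter from its range."""
    return {k: (random.choice(SEARCH_SPACE[k]) if random.random() < rate else v)
            for k, v in individual.items()}

def genetic_search(pop_size=10, generations=5, elite=2):
    """Evolve a population of configurations; evaluations are independent,
    so in a multi-node setting they could be distributed across a cluster."""
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=evaluate, reverse=True)
        parents = scored[:elite]
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - elite)]
        population = parents + children
    return max(population, key=evaluate)

if __name__ == "__main__":
    best = genetic_search()
    print("Best hyper-parameters found:", best)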
