Authors: Thomas Bartz-Beielstein, Frederik Rehbach, Amrita Sen, and Martin Zaefferer
ACM-class: A.1; B.8.0; G.1.6; G.4; I.2.8
A surrogate-model-based hyperparameter tuning approach for deep learning is
presented. This article demonstrates how the architecture-level parameters
(hyperparameters) of deep learning models implemented in Keras/TensorFlow can
be optimized. The implementation of the tuning procedure is fully accessible
from R, the software environment for statistical computing. With a few lines
of code, existing R packages (tfruns and SPOT) can be combined to perform
hyperparameter tuning. An elementary hyperparameter tuning task (a neural
network trained on the MNIST data) is used to exemplify this approach.