Invited Talk @GECCO: Hyperparameter Tuning for Machine and Deep Learning with R

Thomas Bartz-Beielstein will present ideas from the book “Hyperparameter Tuning for Machine and Deep Learning with R – A Practical Guide” at GECCO 2022 during the AABOH Workshop. His talk starts at 17:40 Berlin time (CEST) / 11:40 Boston time (EDT, UTC-4).

AABOH – The Analysing Algorithmic Behaviour of Optimisation Heuristics Workshop

The Analysing Algorithmic Behaviour of Optimisation Heuristics Workshop (AABOH) is part of the 2022 Genetic and Evolutionary Computation Conference (GECCO’22).

About. Optimisation and Machine Learning tools are among the most widely used tools in a modern world full of omnipresent computing devices. Yet the dynamics of these tools have not been analysed in detail. This scarcity of knowledge about the inner workings of heuristic methods is largely attributed to the complexity of the underlying processes, which cannot be subjected to a complete theoretical analysis. It is, however, also partially due to superficial experimental set-ups and, therefore, superficial interpretations of numerical results. Indeed, researchers and practitioners typically look only at the final result produced by these methods, while the vast amount of information collected over the run(s) is wasted. In light of these considerations, it is becoming more evident that such information can be useful and that design principles should be defined that allow for online or offline analysis of the processes taking place in the population and their dynamics.

Hence, with this workshop, we discuss both theoretical and empirical achievements that identify the desired features of optimisation and machine learning algorithms, quantify the importance of such features, spot the presence of intrinsic structural biases and other undesired algorithmic flaws, and study transitions in algorithmic behaviour in terms of convergence, any-time behaviour, performance, robustness, etc. The goal is to gather the most recent advances to fill the aforementioned knowledge gap and to disseminate the current state of the art within the research community.

Topics of Interest

Submissions should present carefully designed experiments or data-heavy approaches that help analyse primary algorithmic behaviours and model the internal dynamics that cause them. As an indication, some (but not all) relevant topics of interest are listed below:

  • global search vs. local search,
  • exploration vs. exploitation,
  • time and space complexity,
  • premature convergence and stagnation,
  • structural bias,
  • genotypic or phenotypic diversity,
  • robustness of the produced solution,
  • secondary benchmarking,
  • anytime performance.

Program

The AABOH workshop will take place on Sunday, July 10, 08:30-12:40 (EDT) and will consist of two sessions.

Session 1: Contributed papers

08:30  Welcome Talk
08:35  Survivor Selection in a Crossoverless Evolutionary Algorithm
       Nielis Brouwer, Danny Dijkzeul, Levi Koppenhol, Iris Pijning, Daan van den Berg
08:50  Exactly Characterizable Parameter Settings in a Crossoverless Evolutionary Algorithm
       Levi Koppenhol, Nielis Brouwer, Danny Dijkzeul, Iris Pijning, Joeri Sleegers, Daan van den Berg
09:05  Examining Algorithm Behavior using Recurrence Quantification and Landscape Analyses
       Mario Muñoz Acosta
09:20  The Effect of Decoding Fairness on Particle Median Problem
       Pavel Krömer, Vojtěch Uher
09:35  Dynamic Computational Resource Allocation for CFD Simulations Based on Pareto Front Optimization
       Gašper Petelin, Margarita Antoniou, Gregor Papa
09:50  Using Structural Bias to Analyse the Behaviour of Modular CMA-ES
       Diederick Vermetten, Fabio Caraffini, Bas van Stein, Anna Kononova
10:05  Closing

Session 2: Theoretical and Empirical Analysis of Optimisation Heuristics

10:50  Opening Talk
10:55  Invited Talk: Benjamin Doerr, École Polytechnique, Palaiseau, France
       Title: Don’t Implement, Think!
Abstract:
It is clear that the vast majority of algorithm analyses are experimental. In this talk, I shall argue that mathematical analyses of algorithms have a few undeniable advantages. I shall then argue that theory vs. experiments is not an exclusive-or. Rather, you should use your mathematical skills to understand as much as possible via theoretical means (allowing any degree of imprecision or unproven assumptions that is necessary to survive). Then, and only then, is it time to design a meaningful experiment that does not just blindly collect a mass of incomprehensible data, but that confirms (or disproves) the theories obtained before. Of course you should implement, but backed up by as much theory as possible!

11:40  Invited Talk: Thomas Bartz-Beielstein, TH Köln, Institute for Data Science, Engineering, and Analytics, Germany
       Title: Hyperparameter Tuning of Deep Neural Networks
Abstract:
A surrogate-model-based Hyperparameter Tuning (HPT) approach for Deep Learning (DL) is presented. We will demonstrate how the architecture-level parameters (hyperparameters) of Deep Neural Networks (DNNs) implemented in Keras/TensorFlow can be optimized. The implementation of the tuning procedure is 100% accessible from R, the software environment for statistical computing. The performances of six Machine Learning (ML) methods (k-Nearest-Neighbor (KNN), Elastic Net (EN), Decision Tree (DT), Random Forest (RF), Extreme Gradient Boosting (XGBoost), and Support Vector Machine (SVM)) are compared to the results from the DNN. The R package SPOT is used as a “datascope” to analyze the results of the HPT runs from several perspectives: in addition to Classification and Regression Trees (CART), the analysis combines results from surface, sensitivity, and parallel plots with a classical regression analysis. This study provides valuable insights into the tunability of several ML and DL methods, which is of great importance for the AI practitioner. This keynote presents results from the forthcoming book “Hyperparameter Tuning for Machine and Deep Learning with R”, edited by Eva Bartz, Thomas Bartz-Beielstein, Martin Zaefferer, and Olaf Mersmann, to be published by Springer.
(A minimal SPOT usage sketch is shown after the program below.)

12:25  Panel Discussion
12:35  Closing Remarks
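
To give a flavour of how such a SPOT-based tuning run looks in practice, here is a minimal sketch in R. It tunes a toy two-dimensional objective (SPOT’s built-in funSphere) instead of a real DNN; the bounds and the evaluation budget are illustrative assumptions, not settings from the book or the talk.

# Minimal sketch of a surrogate-model-based tuning run with the R package SPOT.
# The objective (funSphere) and the bounds/budget below are illustrative
# placeholders, not the DNN setup from the talk.
library(SPOT)

result <- spot(
  x = NULL,                       # no user-supplied initial design
  fun = funSphere,                # toy objective; a real run would wrap DNN training here
  lower = c(-5, -5),              # lower bounds of the two parameters
  upper = c(5, 5),                # upper bounds of the two parameters
  control = list(funEvals = 20)   # total budget of objective function evaluations
)

result$xbest   # best parameter vector found
result$ybest   # corresponding objective value

In the setting described in the abstract, fun would instead wrap the training and validation of a Keras/TensorFlow model, with the hyperparameters of the network as the search dimensions; result$xbest and result$ybest then hold the best hyperparameter vector found and its validation loss.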