• Conference
  • Engineering and Numerical Tools

Conference: communication with proceedings at an international conference

Neural networks are widely used in the literature, across a variety of fields and applications. A major challenge in their use is the need to identify and tune appropriate hyperparameter values. Grid Search is a widely used technique for this task: it systematically searches over values in a predefined range for each hyperparameter. However, selecting the
appropriate range of hyperparameters can be difficult, as the
search space can be vast, resulting in an extensive number of
combinations to be tested. It is better suited to short, fast searches for hyperparameter values within ranges that are already known to perform well. In this paper, we present an improvement to Grid Search called BootBOGS, a bootstrap-based approach to
hyperparameter optimization. BootBOGS is a hybrid approach
that combines bootstrap and Bayesian Search with the Grid
Search technique to perform an efficient search in hyperparameter space. Bayesian Search is used to initialize hyperparameter
ranges. Bootstrap is used to explore the distribution of model
performance for each hyperparameter combination and to reduce
its variance, enabling us to better understand the margins of
these hyperparameters and to narrow these ranges. Grid Search is then used to refine the selection of hyperparameters, as sketched in the example below. To
evaluate the effectiveness of the proposed approach, a set of computational experiments is carried out on four classification datasets, on which we compare BootBOGS with several other strategies: Grid Search, Random Search, and Bayesian Optimization. The results show that our method finds better hyperparameter configurations in terms of predictive quality within a reasonable runtime and leads to a more robust and reliable hyperparameter tuning process.
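
The following is a minimal Python sketch of the three-stage idea described in the abstract: a coarse initialization of hyperparameter ranges (here a simple log-uniform random sampling stands in for the Bayesian Search step), bootstrap resampling to estimate the mean and variance of performance for the retained combinations and narrow the ranges, and a final Grid Search restricted to the narrowed grid. The model, hyperparameters, ranges, and resampling counts are illustrative assumptions, not the configuration used in the paper.

    # Hedged sketch of a BootBOGS-style pipeline: coarse initialization,
    # bootstrap-based narrowing of hyperparameter ranges, then Grid Search
    # refinement. All names, ranges, and the model choice are illustrative.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV, cross_val_score
    from sklearn.neural_network import MLPClassifier
    from sklearn.utils import resample

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=500, n_features=20, random_state=0)

    def score(params, X_, y_):
        """Mean 3-fold CV accuracy of an MLP for one hyperparameter combination."""
        model = MLPClassifier(alpha=params["alpha"],
                              learning_rate_init=params["lr"],
                              max_iter=300, random_state=0)
        return cross_val_score(model, X_, y_, cv=3).mean()

    # --- Stage 1: coarse initialization of hyperparameter ranges ------------
    # Stand-in for the Bayesian Search step: sample candidates log-uniformly
    # and keep the best few to define an initial region of interest.
    candidates = [{"alpha": 10 ** rng.uniform(-5, -1), "lr": 10 ** rng.uniform(-4, -1)}
                  for _ in range(10)]
    candidates.sort(key=lambda p: score(p, X, y), reverse=True)
    top = candidates[:3]

    # --- Stage 2: bootstrap to study the performance distribution -----------
    # Re-evaluate each retained combination on bootstrap resamples of the data
    # to estimate the mean and spread of its performance, then keep the
    # combination that is both good and stable as the centre of the new ranges.
    stats = []
    for p in top:
        scores = [score(p, *resample(X, y, random_state=b)) for b in range(5)]
        stats.append((np.mean(scores), np.std(scores), p))
    _, _, best_p = max(stats, key=lambda t: t[0] - t[1])

    # --- Stage 3: Grid Search restricted to the narrowed ranges -------------
    grid = {"alpha": np.geomspace(best_p["alpha"] / 3, best_p["alpha"] * 3, 3),
            "learning_rate_init": np.geomspace(best_p["lr"] / 3, best_p["lr"] * 3, 3)}
    search = GridSearchCV(MLPClassifier(max_iter=300, random_state=0), grid, cv=3)
    search.fit(X, y)
    print("Refined hyperparameters:", search.best_params_)

The point of the staging is that the exhaustive Grid Search only runs over the small, narrowed grid, which keeps the number of combinations manageable, while the bootstrap step guards against centring that grid on a combination whose good score was largely due to variance.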