2021 International Joint Conference on Neural Networks (IJCNN) | 2021

A Meta-Learning Approach for Automated Hyperparameter Tuning in Evolving Data Streams

Abstract


Most machine learning algorithms rely on a set of hyperparameters, which are tuned either by domain experts or by automated methods. In the context of data streams, this task is further complicated by the need to re-tune every time a concept drift occurs. Automated tuning can therefore yield classification performance improvements, but current methods optimise the hyperparameters from scratch each time concept drift is detected. Meta-learning techniques developed for static machine learning tasks, by contrast, exploit pre-built knowledge to adapt faster to new tasks. We propose a meta-learning approach for automated hyperparameter tuning in evolving data streams, specifically for tuning the parameters of the Adaptive Random Forest (ARF) classifier and of several drift detectors. We compare our approach against non-adaptive methods on both synthetic and real-world datasets. Empirical results show that our approach strikes a compromise between predictive performance and resource consumption for ARF, and between true positives and false positives for the drift detectors.

Pages 1-8
DOI 10.1109/IJCNN52387.2021.9533842
Language English
Journal 2021 International Joint Conference on Neural Networks (IJCNN)
