A survey on multi-objective hyperparameter optimization algorithms for machine learning

Keywords: hyperparameter optimization; multi-objective optimization; machine learning; metaheuristic; metamodel. Subjects: Machine Learning (cs.LG); Artificial Intelligence (cs.AI); Optimization and Control (math.OC)
DOI: 10.1007/s10462-022-10359-2 Publication Date: 2022-12-24T07:02:36Z
ABSTRACT
Hyperparameter optimization (HPO) is a necessary step to ensure the best possible performance of Machine Learning (ML) algorithms. Several methods have been developed to perform HPO; most of these are focused on optimizing one performance measure (usually an error-based measure), and the literature on such single-objective HPO problems is vast. Recently, though, algorithms have appeared that focus on optimizing multiple conflicting objectives simultaneously. This article presents a systematic survey of the literature published between 2014 and 2020 on multi-objective HPO algorithms, distinguishing between metaheuristic-based algorithms, metamodel-based algorithms, and approaches using a mixture of both. We also discuss the quality metrics used to compare multi-objective HPO procedures and present future research directions.
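The core idea behind the multi-objective HPO algorithms surveyed is that, when objectives conflict (e.g., validation error versus inference cost), no single best configuration exists; instead, the algorithms return a Pareto front of non-dominated configurations. A minimal sketch of this concept, assuming two hypothetical objectives to be minimized and a simple random-search sampler (not any specific algorithm from the survey):

```python
import random

def pareto_front(points):
    """Return the non-dominated points when minimizing both objectives.

    A point p is dominated if some other point q is no worse in both
    objectives and strictly better in at least one.
    """
    front = []
    for p in points:
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and (q[0] < p[0] or q[1] < p[1])
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

random.seed(0)
# Hypothetical evaluation results: each sampled hyperparameter
# configuration yields (validation error, model size), both minimized.
evaluations = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(50)]
front = pareto_front(evaluations)
```

Metaheuristic-based methods (e.g., evolutionary algorithms) and metamodel-based methods (e.g., Bayesian optimization with a surrogate) differ mainly in how they choose which configurations to evaluate next; the Pareto-dominance criterion above is common to both families.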