- Machine Learning and Data Classification
- Metaheuristic Optimization Algorithms Research
- Advanced Multi-Objective Optimization Algorithms
- Advanced Bandit Algorithms Research
- Industrial Vision Systems and Defect Detection
- Data Stream Mining Techniques
- Advanced Neural Network Applications
- Machine Learning and Algorithms
- Neural Networks and Applications
- Fault Detection and Control Systems
- Cloud Computing and Resource Management
- Metabolomics and Mass Spectrometry Studies
- Advanced Database Systems and Queries
Peking University
2021-2023
Tencent (China)
2023
King University
2022
Black-box optimization (BBO) has a broad range of applications, including automatic machine learning, engineering, physics, and experimental design. However, it remains challenging for users to apply BBO methods to their problems at hand with existing software packages, in terms of applicability, performance, and efficiency. In this paper, we build OpenBox, an open-source and general-purpose BBO service with improved usability. The modular design behind OpenBox also facilitates flexible abstraction of basic components...
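The black-box setting described above can be illustrated with a minimal random-search baseline: the optimizer only sees configurations in and objective values out, with no gradients. This is a generic sketch, not OpenBox's actual API; the objective and parameter names are illustrative.

```python
import random

def black_box(config):
    """Toy stand-in for an expensive black-box objective:
    minimize a shifted quadratic over two continuous parameters."""
    x, y = config["x"], config["y"]
    return (x - 2.0) ** 2 + (y + 1.0) ** 2

def random_search(objective, bounds, n_trials=200, seed=0):
    """Simplest BBO baseline: sample configurations uniformly at
    random within the given bounds and keep the best observed one."""
    rng = random.Random(seed)
    best_config, best_value = None, float("inf")
    for _ in range(n_trials):
        config = {name: rng.uniform(lo, hi) for name, (lo, hi) in bounds.items()}
        value = objective(config)
        if value < best_value:
            best_config, best_value = config, value
    return best_config, best_value

bounds = {"x": (-5.0, 5.0), "y": (-5.0, 5.0)}
best_config, best_value = random_search(black_box, bounds)
```

Services like OpenBox replace the uniform sampler with model-based suggestion (e.g., Bayesian optimization) while keeping this same suggest-then-evaluate loop.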
The ever-growing demand and complexity of machine learning are putting pressure on hyper-parameter tuning systems: while the evaluation cost of models continues to increase, the scalability of state-of-the-arts starts to become a crucial bottleneck. In this paper, inspired by our experience when deploying hyper-parameter tuning in a real-world application in production and the limitations of existing systems, we propose Hyper-Tune, an efficient and robust distributed hyper-parameter tuning framework. Compared with existing systems, Hyper-Tune highlights multiple system optimizations,...
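A key lever such tuning systems use against rising evaluation cost is multi-fidelity early stopping. The sketch below shows the core successive-halving idea that families like Hyperband/ASHA build on; it is a generic illustration, not Hyper-Tune's implementation, and the toy `evaluate` function is an assumption standing in for partial model training.

```python
import random

def evaluate(config, budget):
    """Toy stand-in for training a model for `budget` epochs: the
    observed loss gets closer to the config's true quality (here,
    the config value itself; smaller is better) as budget grows."""
    noise = random.Random(hash((config, budget))).uniform(-0.05, 0.05)
    return config + noise / budget

def successive_halving(configs, min_budget=1, eta=3, rounds=3):
    """Give many configs a small budget, then repeatedly keep the
    best-scoring 1/eta fraction and multiply their budget by eta."""
    survivors, budget = list(configs), min_budget
    for _ in range(rounds):
        scored = sorted(survivors, key=lambda c: evaluate(c, budget))
        survivors = scored[: max(1, len(scored) // eta)]
        budget *= eta
    return survivors[0]

rng = random.Random(42)
candidates = [round(rng.uniform(0.0, 1.0), 6) for _ in range(27)]
best = successive_halving(candidates)
```

Most of the total budget is spent on the few promising survivors, which is what makes the approach scale when single evaluations are expensive.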
The distributed data analytic system -- Spark is a common choice for processing massive volumes of heterogeneous data, but it is challenging to tune its parameters to achieve high performance. Recent studies try to employ auto-tuning techniques to solve this problem but suffer from three issues: limited functionality, high overhead, and inefficient search. In this paper, we present a general and efficient Spark tuning framework that can deal with the three issues simultaneously. First, we introduce a generalized tuning formulation, which...
With the extensive applications of machine learning models, automatic hyperparameter optimization (HPO) has become increasingly important. Motivated by the tuning behaviors of human experts, it is intuitive to leverage auxiliary knowledge from past HPO tasks to accelerate the current task. In this paper, we propose TransBO, a novel two-phase transfer learning framework for HPO, which can deal with the complementary nature among source tasks and the dynamics during knowledge aggregation simultaneously. This framework extracts and aggregates...
Black-box optimization (BBO) has a broad range of applications, including automatic machine learning, experimental design, and database knob tuning. However, users still face challenges when applying BBO methods to their problems at hand with existing software packages, in terms of applicability, performance, and efficiency. This paper presents OpenBox, an open-source BBO toolkit with improved usability. It implements user-friendly interfaces and visualization for users to define and manage their tasks. The modular design behind...
Distributed data analytic engines like Spark are common choices to process massive data in industry. However, the performance of SQL queries highly depends on the choice of configurations, where the optimal ones vary with the executed workloads. Among various alternatives for tuning, Bayesian optimization (BO) is a popular framework that finds near-optimal configurations given a sufficient budget, but it suffers from the re-optimization issue and is not practical in real production. When applying transfer learning to accelerate...
The tuning of hyperparameters becomes increasingly important as machine learning (ML) models have been extensively applied in data mining applications. Among various approaches, Bayesian optimization (BO) is a successful methodology to tune hyper-parameters automatically. While traditional methods optimize each tuning task in isolation, there has been recent interest in speeding up BO by transferring knowledge across previous tasks. In this work, we introduce an automatic method to design the BO search space with...
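One common form of search-space transfer is to restrict the new task's space to the region where past tasks' best configurations fell. The sketch below illustrates that general idea only, not this paper's specific algorithm; the hyper-parameter names and history values are made up for illustration.

```python
def promising_subspace(history, top_frac=0.4, margin=0.1):
    """Shrink each hyper-parameter's range to the bounding box of the
    top-performing past configurations, plus a small safety margin.
    `history` is a list of (config_dict, loss) pairs; lower loss wins."""
    ranked = sorted(history, key=lambda item: item[1])
    top = [cfg for cfg, _ in ranked[: max(1, int(len(ranked) * top_frac))]]
    new_space = {}
    for name in top[0]:
        values = [cfg[name] for cfg in top]
        lo, hi = min(values), max(values)
        pad = (hi - lo) * margin  # keep a little slack around the box
        new_space[name] = (lo - pad, hi + pad)
    return new_space

# Hypothetical observations gathered on previous tuning tasks.
history = [({"lr": 0.1,  "depth": 8}, 0.31),
           ({"lr": 0.01, "depth": 6}, 0.12),
           ({"lr": 0.02, "depth": 5}, 0.15),
           ({"lr": 0.5,  "depth": 2}, 0.55),
           ({"lr": 0.03, "depth": 7}, 0.14)]
space = promising_subspace(history)
```

Running BO inside `space` instead of the full original ranges concentrates the budget where past evidence suggests good configurations live; the margin hedges against the new task differing from the old ones.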
In this paper, we describe our method for tackling the automated hyperparameter optimization challenge in the QQ Browser 2021 AI Algorithm Competition (ACM CIKM AnalyticCup Track 2). The competition organizers provide anonymized realistic industrial tasks and datasets for black-box optimization. Based on the open-sourced package OpenBox, we adopt a Bayesian optimization framework for configuration sampling and a heuristic early stopping strategy. We won first place in both the preliminary and final contests with results of 0.938291...