Optimal Transport Kernels for Sequential and Parallel Neural Architecture Search

Subjects: Machine Learning (cs.LG); Neural and Evolutionary Computing (cs.NE); Machine Learning (stat.ML)
DOI: 10.48550/arxiv.2006.07593
Publication Date: 2020-01-01
ABSTRACT
23 pages, camera-ready version for ICML 2021.

Neural architecture search (NAS) automates the design of deep neural networks. One of the main challenges in searching over complex, non-continuous architectures is comparing the similarity of networks, which the conventional Euclidean metric may fail to capture. Optimal transport (OT) is resilient to such complex structures because it considers the minimal cost of transporting one network into another. However, OT is generally not negative definite, which may limit its ability to build the positive-definite kernels required in many kernel-dependent frameworks. Building upon the tree-Wasserstein (TW) distance, a negative-definite variant of OT, we develop a novel discrepancy for neural architectures and demonstrate it within a Gaussian process surrogate model for the sequential NAS setting. Furthermore, we derive a novel parallel NAS method that uses a quality k-determinantal point process (k-DPP) on the GP posterior to select diverse and high-performing architectures from a discrete set of candidates. Empirically, we demonstrate that our TW-based approaches outperform other baselines in both sequential and parallel NAS.
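To make the kernel construction described in the abstract concrete, here is a minimal, illustrative sketch (not the authors' implementation). It assumes each architecture has already been encoded as a probability distribution over the nodes of a fixed weighted tree; the toy tree, edge weights, and architecture encodings below are placeholder assumptions, and the paper's actual tree construction over network operations and topology is more elaborate. The sketch evaluates the closed-form tree-Wasserstein distance and the positive-definite kernel exp(-TW / lengthscale) that could then serve as a GP surrogate covariance.

```python
import numpy as np

def tree_wasserstein(weights, parent, mu, nu):
    """Closed-form tree-Wasserstein distance between two probability
    measures mu and nu supported on the nodes of a rooted tree.

    weights[i] : weight of the edge connecting node i to parent[i]
                 (weights[0] is unused, since the root has no parent edge)
    parent[i]  : index of the parent of node i (-1 for the root)
    mu, nu     : nonnegative mass vectors over the nodes, each summing to 1

    TW(mu, nu) = sum over edges e of w_e * |mu(subtree below e) - nu(subtree below e)|
    """
    n = len(parent)
    diff = np.asarray(mu, dtype=float) - np.asarray(nu, dtype=float)
    # Accumulate the mass difference of every subtree, processing children
    # before parents (assumes nodes are ordered so that parent[i] < i for i > 0;
    # reorder the nodes beforehand otherwise).
    subtree = diff.copy()
    for i in range(n - 1, 0, -1):
        subtree[parent[i]] += subtree[i]
    # Sum |subtree mass difference| * edge weight over all non-root edges.
    return sum(weights[i] * abs(subtree[i]) for i in range(1, n))

def tw_kernel(archs, weights, parent, lengthscale=1.0):
    """Kernel matrix K[i, j] = exp(-TW(a_i, a_j) / lengthscale); positive
    definite because TW is negative definite."""
    n = len(archs)
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = np.exp(-tree_wasserstein(weights, parent, archs[i], archs[j]) / lengthscale)
    return K

# Toy example: a 4-node tree (root 0 with children 1 and 2; node 3 under 2)
# and two hypothetical "architectures" encoded as node distributions.
parent  = [-1, 0, 0, 2]
weights = [0.0, 1.0, 1.0, 0.5]
a = [0.0, 0.7, 0.2, 0.1]
b = [0.0, 0.2, 0.3, 0.5]
print(tree_wasserstein(weights, parent, a, b))   # 1.2 on this toy tree
print(tw_kernel([a, b], weights, parent))
```

For the parallel setting, the kernel above would be combined with per-architecture quality scores from the GP posterior and a batch drawn via a quality k-DPP, which trades off those scores against the determinant of the corresponding kernel submatrix to favor diverse, high-performing candidates; that selection step is not shown here.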