Local Maxima in the Likelihood of Gaussian Mixture Models: Structural Results and Algorithmic Consequences

Subjects: Machine Learning (cs.LG); Machine Learning (stat.ML); Optimization and Control (math.OC)
DOI: 10.48550/arxiv.1609.00978
Publication Date: 2016
ABSTRACT
We provide two fundamental results on the population (infinite-sample) likelihood function of Gaussian mixture models with $M \geq 3$ components. Our first main result shows that the population likelihood function has bad local maxima even in the special case of equally-weighted mixtures of well-separated and spherical Gaussians. We prove that the log-likelihood value of these bad local maxima can be arbitrarily worse than that of any global optimum, thereby resolving an open question of Srebro (2007). Our second main result shows that the EM algorithm (or a first-order variant of it) with random initialization will converge to bad critical points with probability at least $1 - e^{-\Omega(M)}$. We further establish that, almost surely, a first-order variant of EM does not converge to strict saddle points, indicating that the poor performance of the first-order method can be attributed to the existence of bad local maxima rather than to bad saddle points. Overall, our results highlight the necessity of careful initialization when using the EM algorithm in practice, even in highly favorable settings.

Neural Information Processing Systems (NIPS) 2016
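To make the abstract's claim concrete, below is a minimal sketch (not the paper's code) of EM in the setting the authors study: an equally-weighted mixture of unit-variance spherical Gaussians in which only the means are estimated. All concrete choices here are illustrative assumptions, including $M = 3$, one-dimensional data, the separation of 10, and the two initializations; in particular, the deliberately "clumped" start stands in for the kind of configuration that random initialization produces with constant probability when $M \geq 3$.

```python
# Minimal EM sketch for the setting in the abstract: an equally-weighted
# mixture of unit-variance spherical Gaussians, estimating only the means.
# All concrete choices (M = 3, 1-D data, separation, sample sizes, and the
# two initializations) are illustrative assumptions, not the paper's.
import numpy as np

def em_means(X, mu0, n_iters=200):
    """Run EM updates for the component means, holding the (equal)
    mixture weights and (unit) variances fixed."""
    mu = mu0.copy()
    for _ in range(n_iters):
        # E-step: posterior responsibilities under equal weights.
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)   # (n, M)
        logp = -0.5 * d2
        logp -= logp.max(axis=1, keepdims=True)                # stability
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: each mean is the responsibility-weighted average.
        mu = (r.T @ X) / r.sum(axis=0)[:, None]
    return mu

def avg_loglik(X, mu):
    """Average log-likelihood of X under the equal-weight, unit-variance
    spherical mixture with means mu."""
    d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
    logp = -0.5 * d2 - 0.5 * X.shape[1] * np.log(2 * np.pi) - np.log(len(mu))
    m = logp.max(axis=1, keepdims=True)
    return float((m[:, 0] + np.log(np.exp(logp - m).sum(axis=1))).mean())

rng = np.random.default_rng(0)
true_mu = np.array([[-10.0], [0.0], [10.0]])     # well-separated means
X = np.concatenate([m + rng.standard_normal((2000, 1)) for m in true_mu])

# A clumped start (all centers inside one cluster): the kind of draw that
# random initialization hits with constant probability when M >= 3.
clumped = np.array([[9.0], [10.0], [11.0]])
near_truth = true_mu + 0.1 * rng.standard_normal(true_mu.shape)

for name, mu0 in [("clumped init   ", clumped), ("near-truth init", near_truth)]:
    mu = em_means(X, mu0)
    print(f"{name}: means = {np.sort(mu.ravel()).round(2)}, "
          f"avg log-lik = {avg_loglik(X, mu):.2f}")
```

With the near-truth start, EM recovers means close to $(-10, 0, 10)$; with the clumped start, it converges to a configuration in which one center straddles two true components while two centers split the third, and the average log-likelihood is markedly worse, mirroring the bad local maxima described above.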