Bounded generalized Gaussian mixture model

Keywords: Generalized normal distribution; Gaussian network model; Laplace's method
DOI: 10.1016/j.patcog.2014.03.030
Publication Date: 2014-04-04T02:19:10Z
ABSTRACT
The generalized Gaussian mixture model (GGMM) provides a flexible and suitable tool for many computer vision and pattern recognition problems. However, the generalized Gaussian distribution is unbounded, whereas in many applications the observed data are digitized and have bounded support. In this paper we propose an extension of the generalized Gaussian distribution and, based on it, a new bounded generalized Gaussian mixture model (BGGMM), which includes the Gaussian mixture model (GMM), the Laplace mixture model (LMM), and the GGMM as special cases. The new distribution is flexible enough to fit different shapes of observed data, such as non-Gaussian and bounded-support data. To estimate the model parameters, we propose an alternating approach that minimizes an upper bound on the negative log-likelihood of the data. We quantify the performance of the BGGMM with simulations and real data.
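To make the construction concrete, the following sketch (an illustration, not the paper's implementation) evaluates a generalized Gaussian density and a bounded-support version obtained by truncating to an interval [a, b] and renormalizing by the probability mass inside it, approximated here with simple trapezoidal quadrature. The parameter names `mu` (location), `alpha` (scale), and `beta` (shape) are conventional, not taken from the paper; `beta = 2` recovers the Gaussian shape and `beta = 1` the Laplacian.

```python
import math

def ggd_pdf(x, mu=0.0, alpha=1.0, beta=2.0):
    """Generalized Gaussian density:
    f(x) = beta / (2 * alpha * Gamma(1/beta)) * exp(-(|x - mu| / alpha)^beta).
    beta = 2 gives a Gaussian shape, beta = 1 a Laplacian."""
    coeff = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return coeff * math.exp(-((abs(x - mu) / alpha) ** beta))

def bounded_ggd_pdf(x, a, b, mu=0.0, alpha=1.0, beta=2.0, n=10000):
    """Truncate the generalized Gaussian to [a, b] and renormalize by the
    mass inside the interval, estimated with trapezoidal quadrature.
    (A hypothetical normalization scheme; the paper's may differ.)"""
    if not (a <= x <= b):
        return 0.0
    h = (b - a) / n
    mass = 0.5 * (ggd_pdf(a, mu, alpha, beta) + ggd_pdf(b, mu, alpha, beta))
    mass += sum(ggd_pdf(a + i * h, mu, alpha, beta) for i in range(1, n))
    mass *= h
    return ggd_pdf(x, mu, alpha, beta) / mass
```

A mixture model then combines several such bounded components with mixing weights, and the parameters would be fitted by an EM-style alternating scheme as the abstract describes.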