Exponential families are characterized by their log-normalizer function F and include the following well-known distributions:
Gaussian (generic, isotropic, diagonal, rectified Gaussian or Wald, lognormal), Poisson, Bernoulli, binomial, multinomial, Laplacian, Gamma (including chi-squared), Beta, exponential, Wishart, Dirichlet, Rayleigh, probability simplex,
negative binomial, Weibull, von Mises, Pareto, skew logistic, etc.
All the corresponding formulas of the canonical decomposition are given in the documentation.
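As a concrete illustration of the canonical decomposition p(x|θ) = exp(⟨θ, t(x)⟩ − F(θ)), the following self-contained Java sketch (independent of jMEF; the class and method names are ours, not the library's) evaluates a univariate Gaussian through its natural parameters and log-normalizer F, and compares the result to the usual closed-form density:

```java
// Canonical decomposition of the univariate Gaussian as an exponential family:
// sufficient statistic t(x) = (x, x^2),
// natural parameters theta = (mu/sigma^2, -1/(2 sigma^2)),
// log-normalizer F(theta) = -theta1^2/(4 theta2) + 0.5 * log(-PI/theta2).
public class GaussianCanonical {

    // Log-normalizer F of the univariate Gaussian family.
    static double logNormalizer(double theta1, double theta2) {
        return -theta1 * theta1 / (4.0 * theta2) + 0.5 * Math.log(-Math.PI / theta2);
    }

    // Density evaluated through the canonical decomposition.
    static double pdfCanonical(double x, double mu, double sigma2) {
        double theta1 = mu / sigma2;
        double theta2 = -1.0 / (2.0 * sigma2);
        return Math.exp(theta1 * x + theta2 * x * x - logNormalizer(theta1, theta2));
    }

    // Usual closed-form Gaussian density, for comparison.
    static double pdfClassic(double x, double mu, double sigma2) {
        return Math.exp(-(x - mu) * (x - mu) / (2.0 * sigma2))
                / Math.sqrt(2.0 * Math.PI * sigma2);
    }

    public static void main(String[] args) {
        double mu = 10.0, sigma2 = 25.0;
        for (double x : new double[] {0.0, 5.0, 10.0, 20.0}) {
            // Both columns agree: the canonical form is the same density.
            System.out.printf("%.12f %.12f%n",
                    pdfCanonical(x, mu, sigma2), pdfClassic(x, mu, sigma2));
        }
    }
}
```

The same pattern (sufficient statistic, natural parameters, log-normalizer) is what jMEF implements for each supported family.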
Mixtures of exponential families provide a generic framework for handling Gaussian mixture models (GMMs, also called MoGs for mixtures of Gaussians), mixtures of Poisson distributions, and Laplacian mixture models as well.
jMEF is a cross-platform Java library developed by Vincent Garcia and Frank Nielsen. jMEF allows one to:
Download jMEF (jar)
Download jMEF (sources)
Documentation
License.txt
This tutorial reproduces the experiment proposed by Banerjee et al. in [5]. We create three 1-dimensional datasets of 1000 samples each, based on mixture models of Gaussian, Poisson and Binomial distributions, respectively. All three mixture models have three components with means centered at 10, 20 and 40, respectively. The standard deviation s of the Gaussian densities is set to 5, and the number of trials N of the Binomial distribution is set to 100, so that the three models are somewhat similar to each other, in the sense that the variance is approximately the same for all three. For each dataset, we estimate the parameters of three mixture models of Gaussian, Poisson and Binomial distributions using the proposed Bregman soft clustering implementation. The quality of the clustering is measured in terms of the normalized mutual information (Strehl and Ghosh, 2002) between the predicted clusters and the original clusters (based on the actual generating mixture component). The results are averaged over 100 trials. This tutorial needs an additional file (k-means).
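The normalized mutual information score used above can be computed directly from the contingency table of predicted versus generating labels. Here is a minimal, self-contained sketch of the Strehl and Ghosh definition, NMI(U,V) = I(U,V) / sqrt(H(U) H(V)); the class is illustrative and not part of jMEF:

```java
import java.util.Arrays;

public class Nmi {
    // Normalized mutual information between two label assignments u and v of
    // the same n points (Strehl & Ghosh, 2002): I(U,V) / sqrt(H(U) * H(V)).
    static double nmi(int[] u, int[] v) {
        int n = u.length;
        int ku = Arrays.stream(u).max().getAsInt() + 1;
        int kv = Arrays.stream(v).max().getAsInt() + 1;
        // Joint distribution from the contingency table.
        double[][] joint = new double[ku][kv];
        for (int i = 0; i < n; i++) joint[u[i]][v[i]] += 1.0 / n;
        // Marginals.
        double[] pu = new double[ku], pv = new double[kv];
        for (int a = 0; a < ku; a++)
            for (int b = 0; b < kv; b++) { pu[a] += joint[a][b]; pv[b] += joint[a][b]; }
        // Mutual information and entropies (natural log; the ratio is base-free).
        double mi = 0.0, hu = 0.0, hv = 0.0;
        for (int a = 0; a < ku; a++)
            for (int b = 0; b < kv; b++)
                if (joint[a][b] > 0) mi += joint[a][b] * Math.log(joint[a][b] / (pu[a] * pv[b]));
        for (double p : pu) if (p > 0) hu -= p * Math.log(p);
        for (double p : pv) if (p > 0) hv -= p * Math.log(p);
        return mi / Math.sqrt(hu * hv);
    }

    public static void main(String[] args) {
        // A clustering that matches the truth up to a label permutation scores 1.
        int[] truth = {0, 0, 1, 1, 2, 2};
        int[] pred  = {1, 1, 2, 2, 0, 0};
        System.out.println(nmi(truth, pred)); // ≈ 1.0
    }
}
```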
Download tutorial
Download additional files
This tutorial consists of the following steps:
We then check that the estimated mixtures f_{1} and f_{2} are similar. This tutorial needs additional files.
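Since the Kullback-Leibler divergence between two mixtures has no closed form, such a similarity check is typically done by Monte Carlo estimation: sample from f_1 and average log f_1(x) − log f_2(x). Below is a self-contained sketch for 1-D Gaussian mixtures; the array-based representation and method names are illustrative, not jMEF's API:

```java
import java.util.Random;

public class MixtureKL {
    // Density of a 1-D Gaussian mixture given by weights w, means mu, variances s2.
    static double pdf(double x, double[] w, double[] mu, double[] s2) {
        double p = 0.0;
        for (int i = 0; i < w.length; i++)
            p += w[i] * Math.exp(-(x - mu[i]) * (x - mu[i]) / (2 * s2[i]))
                      / Math.sqrt(2 * Math.PI * s2[i]);
        return p;
    }

    // Monte Carlo estimate of KL(f1 || f2) = E_{x ~ f1}[ log f1(x) - log f2(x) ].
    static double klMonteCarlo(double[] w1, double[] mu1, double[] s21,
                               double[] w2, double[] mu2, double[] s22,
                               int samples, Random rng) {
        double acc = 0.0;
        for (int t = 0; t < samples; t++) {
            // Draw a component of f1 according to its weight, then a Gaussian sample.
            double u = rng.nextDouble(), cum = 0.0;
            int k = 0;
            while (k < w1.length - 1 && u > (cum += w1[k])) k++;
            double x = mu1[k] + Math.sqrt(s21[k]) * rng.nextGaussian();
            acc += Math.log(pdf(x, w1, mu1, s21)) - Math.log(pdf(x, w2, mu2, s22));
        }
        return acc / samples;
    }

    public static void main(String[] args) {
        double[] w = {0.5, 0.5}, mu = {10, 20}, s2 = {25, 25};
        // KL of a mixture against itself is 0: every sampled log-ratio vanishes.
        double kl = klMonteCarlo(w, mu, s2, w, mu, s2, 100000, new Random(0));
        System.out.println(kl); // prints 0.0
    }
}
```

Two estimated mixtures can then be declared similar when this estimate is close to zero.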
Download tutorial
Download additional files
This tutorial consists of the following steps:
This tutorial needs additional files.
Download tutorial
Download additional files
m=1 | m=2 | m=4 | m=8 | m=16 | m=32
This tutorial consists in the following steps:
Note that the hierarchical mixture model makes it possible to automatically extract the optimal number of components in the mixture model. To do this, use the method getOptimalMixtureModel(t) instead of getPrecision(m) in the tutorial.
This tutorial needs additional files.
m=1 | m=2 | m=4 | m=8 | m=16 | m=32
For this tutorial, we consider an input image as a set of pixels in a 5-dimensional space (RGB color information + XY position information). The mixture of Gaussians f is learnt from the set of pixels using the Bregman soft clustering algorithm. Then, we create two images (see Fig. 3).
The proposed tutorial shows that the image structure can be captured in a mixture of Gaussians. The image is then represented by a small set of parameters (compared to the number of pixels), which is well adapted to applications such as color image retrieval. Given an input image represented by its mixture of Gaussians, it is then straightforward to retrieve, from an image database, the set of images that have a similar color organization. This tutorial needs additional files.
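The pixel-to-point conversion described above can be sketched as follows. This is a self-contained illustration on a synthetic in-memory image, not jMEF's own image handling:

```java
import java.awt.image.BufferedImage;

public class PixelFeatures {
    // Turn an image into a set of 5-D points (R, G, B, X, Y): the representation
    // from which the mixture of Gaussians is learnt by Bregman soft clustering.
    static double[][] toPoints(BufferedImage img) {
        int w = img.getWidth(), h = img.getHeight();
        double[][] pts = new double[w * h][5];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int rgb = img.getRGB(x, y); // packed 0xRRGGBB
                double[] p = pts[y * w + x];
                p[0] = (rgb >> 16) & 0xff; // R
                p[1] = (rgb >> 8) & 0xff;  // G
                p[2] = rgb & 0xff;         // B
                p[3] = x;                  // X
                p[4] = y;                  // Y
            }
        }
        return pts;
    }

    public static void main(String[] args) {
        // Tiny synthetic 2x2 image: one red pixel at (x=1, y=0), the rest black.
        BufferedImage img = new BufferedImage(2, 2, BufferedImage.TYPE_INT_RGB);
        img.setRGB(1, 0, 0xff0000);
        double[][] pts = toPoints(img);
        System.out.println(pts.length); // prints 4
        System.out.println(pts[1][0]);  // prints 255.0 (red channel of pixel (1,0))
    }
}
```

In the tutorial, every pixel of the input image becomes one such 5-D point before clustering.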
Download tutorial
Download additional files
[Fig. 3: original images, their Gaussian representations, and the corresponding statistical images]
Please send requests, comments or criticisms to Vincent Garcia and Frank Nielsen. If you would like us to add a particular exponential family distribution, send us:
We give as an example the files corresponding to the multivariate Gaussian distributions:
MultivariateGaussian.java and
MultivariateGaussian.tex.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
See details about MIT license.