
GSoC 2015 Proposal: Improve GMM module


Project Proposal Information

Proposal Title: scikit-learn: Improve GMM module

Abstract

The current implementations of the Gaussian mixture model (GMM), variational Bayesian Gaussian mixture model (VBGMM) and Dirichlet process Gaussian mixture model (DPGMM) in scikit-learn have many problems. They suffer from interface incompatibility, incorrect model training and incomplete testing. In this project, I propose to clean up the API of GMM, re-implement the updating functions of VBGMM and DPGMM based on well-acknowledged literature, and pass various test cases.

This proposal is organized as follows. The Model Introduction section gives a brief overview of the Gaussian mixture model, the variational Bayesian Gaussian mixture model and the Dirichlet process Gaussian mixture model. Next, the Motivation section states the existing problems in scikit-learn. Finally, the Plan section outlines how I will solve these problems during my GSoC participation.

Model Introduction

The Gaussian mixture model (GMM) is a popular unsupervised clustering method. GMM represents the probability distribution of the observations as a linear combination of several Gaussian distributions. Given a prespecified number of Gaussian components, a set of parameters has to be estimated to fit the training data: the means, the covariances and the coefficients of the linear combination. Expectation Maximization (EM) is usually used to find the maximum likelihood parameters of the mixture model. In each iteration, the E-step estimates the conditional distribution of the latent variables, and the M-step finds the model parameters that maximize the likelihood.
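
For reference, the mixture density and the standard EM updates, as given in PRML [14] (Chapter 9), are:

```latex
% Gaussian mixture density with K components
p(x) = \sum_{k=1}^{K} \pi_k \, \mathcal{N}(x \mid \mu_k, \Sigma_k), \qquad \sum_{k=1}^{K} \pi_k = 1

% E-step: responsibility of component k for sample x_n
\gamma(z_{nk}) = \frac{\pi_k \, \mathcal{N}(x_n \mid \mu_k, \Sigma_k)}{\sum_{j=1}^{K} \pi_j \, \mathcal{N}(x_n \mid \mu_j, \Sigma_j)}

% M-step: re-estimate the parameters, with N_k = \sum_n \gamma(z_{nk})
\mu_k^{\mathrm{new}} = \frac{1}{N_k} \sum_{n=1}^{N} \gamma(z_{nk}) \, x_n, \qquad
\Sigma_k^{\mathrm{new}} = \frac{1}{N_k} \sum_{n=1}^{N} \gamma(z_{nk}) (x_n - \mu_k^{\mathrm{new}})(x_n - \mu_k^{\mathrm{new}})^{\top}, \qquad
\pi_k^{\mathrm{new}} = \frac{N_k}{N}
```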

In the variational Bayesian Gaussian mixture model (VBGMM), the M-step is generalized into full Bayesian estimation, where the parameters are represented by a posterior distribution instead of a single point estimate as in maximum-likelihood estimation [14, 15].
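
Concretely, following PRML [14] (Chapter 10), a factorized (mean-field) posterior is assumed and conjugate priors are placed on the parameters:

```latex
% Mean-field factorization of the posterior
q(Z, \pi, \mu, \Lambda) = q(Z) \, q(\pi, \mu, \Lambda)

% Conjugate priors: Dirichlet on the mixing coefficients,
% Gaussian-Wishart on each component's mean and precision
p(\pi) = \mathrm{Dir}(\pi \mid \alpha_0), \qquad
p(\mu, \Lambda) = \prod_{k=1}^{K} \mathcal{N}\!\left(\mu_k \mid m_0, (\beta_0 \Lambda_k)^{-1}\right) \mathcal{W}(\Lambda_k \mid W_0, \nu_0)
```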

On the other hand, the Dirichlet process Gaussian mixture model (DPGMM) allows a mixture with an unbounded number of Gaussian components. It uses a Dirichlet process as a nonparametric prior on the distribution parameters, and the effective number of components can vary with the data, so one does not have to fix the number of components ahead of time. The simplest way to infer a DPGMM is Markov chain Monte Carlo (MCMC), but it is generally slow to converge. Blei's paper [13] proposes truncated variational inference, which converges faster than MCMC.
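
In the truncated variational approach of [13], the Dirichlet process is written in its stick-breaking representation and the variational distribution is truncated at a level T:

```latex
% Stick-breaking construction of the mixing weights (truncated at level T)
v_i \sim \mathrm{Beta}(1, \alpha), \qquad
\pi_i(v) = v_i \prod_{j=1}^{i-1} (1 - v_j), \quad i = 1, \dots, T,
\qquad \text{with } q(v_T = 1) = 1
```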

Motivation

GMM, VBGMM and DPGMM are very popular in various machine learning applications [21]. In scikit-learn, however, the implementation suffers from many problems, which prevents these models from being widely used. Every year since 2012, the "Improve GMM" topic has appeared on scikit-learn's GSoC to-do list.

Plan

To save the GMM, VBGMM and DPGMM modules from the danger of decay, I present the following detailed plan for this GSoC program.

Documentation

The derivations of the updating rules of the three modules are critical for the implementation. The textbook PRML [14] gives detailed updating equations for the Gaussian mixture model and variational inference, although some derivation steps for the updating functions are skipped. Blei's paper [13] gives the updating functions of variational inference for Dirichlet process mixtures of exponential-family distributions. The Gaussian distribution belongs to the exponential family, but the specific updating equations still have to be derived. The existing derivation document for VBGMM and DPGMM [12] is not consistent with the research literature, and as far as I know it has not been thoroughly reviewed since the initial implementation (PR 116, Variational infinite gmm) [22]. This might be the cause of the incorrect algorithm output. I will document the updating equations for all models in detail. The goal of this part is to make the derivations correct, clean and easy to understand.

Other documentation, such as docstrings and examples, will also be updated following scikit-learn conventions. Several IPython notebooks will also demonstrate how to use these three models, with intuitive pictures.
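
As a rough illustration of the kind of notebook snippet I have in mind (using the current sklearn.mixture.GMM class; names and behavior may change after the refactoring):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.mixture import GMM  # current (pre-refactoring) class name

# Toy data: two well-separated Gaussian blobs
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(200, 2),
               rng.randn(200, 2) + [5, 5]])

# Fit a two-component mixture with full covariance matrices
model = GMM(n_components=2, covariance_type='full', n_iter=100)
model.fit(X)

labels = model.predict(X)          # hard cluster assignments
log_density = model.score(X)       # per-sample log-likelihoods (see Issue 2473)

plt.scatter(X[:, 0], X[:, 1], c=labels)
plt.title("Toy GMM clustering")
plt.show()
```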

Design Interface

The API inconsistency needs to be resolved first, for all three classes. The discussions in Issue 2473, Issue 3813, Issue 4062 and Issue 4429 shed light on the new interface. In this stage, I will discuss this issue with the group and nail down a clean and consistent interface shared by GMM, VBGMM and DPGMM. KDE, another density estimator, should also be taken into account.

Refurbish GMM

Based on the discussion of the previous issues, the main problem in GMM lies in the interface. This part consists of the following prospective sub-tasks.

  • Rename GMM to GaussianMixture in order to make it consistent with the naming convention
  • Implement DensityEstimatorMixin, which supports a density method and a score method: density returns the PDF of the input X, and score returns the sum of the log likelihoods of the input X (see the sketch after this list)
  • Add a responsibility_sample method which returns the responsibility of every sample
  • Implement a friendly interface to initialize the parameters of GMM
  • Add a new parameter estimation method, maximum a posteriori (MAP) [Optional] [15]
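
A minimal sketch of what the mixin and the renamed class could look like; DensityEstimatorMixin, density and responsibility_sample are names proposed here and still subject to discussion, not an existing scikit-learn API:

```python
import numpy as np


class DensityEstimatorMixin(object):
    """Hypothetical mixin shared by density estimators (mixtures, KDE, ...)."""

    def density(self, X):
        # PDF of every sample, derived from the per-sample log-densities.
        return np.exp(self.score_samples(X))

    def score(self, X, y=None):
        # Sum of per-sample log-likelihoods, so score() returns a scalar
        # (addressing Issue 2473).
        return np.sum(self.score_samples(X))


class GaussianMixture(DensityEstimatorMixin):
    """Sketch of the renamed GMM estimator; details to be settled with the group."""

    def fit(self, X, y=None):
        raise NotImplementedError  # EM training

    def score_samples(self, X):
        raise NotImplementedError  # per-sample log-density under the fitted mixture

    def responsibility_sample(self, X):
        raise NotImplementedError  # posterior probability of each component per sample
```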

Re-implement updating methods in VBGMM and DPGMM

The problems reported about VBGMM and DPGMM are only the tip of the iceberg. The current evidence points to nonstandard updating equations. There is some uncertainty in this part, because it depends on the mathematical derivations and on intensive testing. The sub-tasks are:

  • Reshape the interface
  • Check the parameter updating methods against the settled mathematical derivations (an example equation is shown after this list), and find out why the existing implementation fails on simple cases
  • Fix the bugs that emerge and refactor problematic methods
  • Better initialization strategy [Optional] [15]
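
As an example of the kind of equation the code will be checked against, the variational E-step in PRML [14] (Section 10.2.1) computes the responsibilities as

```latex
% Unnormalized log-responsibilities in the variational E-step
\ln \rho_{nk} = \mathbb{E}[\ln \pi_k]
              + \tfrac{1}{2} \, \mathbb{E}[\ln |\Lambda_k|]
              - \tfrac{D}{2} \ln(2\pi)
              - \tfrac{1}{2} \, \mathbb{E}_{\mu_k, \Lambda_k}\!\left[(x_n - \mu_k)^{\top} \Lambda_k (x_n - \mu_k)\right]

% Normalization over the components
r_{nk} = \frac{\rho_{nk}}{\sum_{j=1}^{K} \rho_{nj}}
```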

Test

As Issue 2454 points out, there are potential bugs in the current implementation: it cannot even pass toy tests. Moreover, test cases have not yet been written for VBGMM and DPGMM. In this part, each feature will be accompanied by a comprehensive set of tests. The test data sets include, for example, randomly generated toy examples, the Iris data set and the Old Faithful data set. The testing work consists of the following parts (a toy test sketch follows the list).

  • Discuss reasonable testing methods for variational inference, since it is difficult to test the correctness of a nondeterministic algorithm such as MCMC.
  • Unit tests for the updating functions in the E-step and M-step
  • Integration tests for numerical stability [22] and for convergence in terms of log-likelihood. There has been recent discussion of the problems of variational inference at a NIPS workshop [23]; I will do some research on this.
  • Benchmarks with regard to n_samples, n_features, n_components, n_iterations, tolerance and the four covariance types.
    • Plot the likelihood (or lower bound) against some of the above variables
    • Use the model as a classifier to test accuracy, recall and F1-score
  • Profile memory usage and runtime to find the bottlenecks, then improve the implementation and compare its performance against the current one.
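
A minimal toy test in this spirit, written against the current GMM class (the refactored class would be tested the same way), might look like:

```python
import numpy as np
from numpy.testing import assert_allclose
from sklearn.mixture import GMM  # would become GaussianMixture after the renaming


def test_gmm_recovers_separated_means():
    # Two well-separated spherical clusters; EM should recover both means.
    rng = np.random.RandomState(0)
    true_means = np.array([[0.0, 0.0], [10.0, 10.0]])
    X = np.vstack([rng.randn(500, 2) + true_means[0],
                   rng.randn(500, 2) + true_means[1]])

    model = GMM(n_components=2, covariance_type='full')
    model.fit(X)

    # Sort the estimated means so the comparison is order-independent.
    assert_allclose(np.sort(model.means_, axis=0),
                    np.sort(true_means, axis=0), atol=0.5)
```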

Time line

  • Week 1, 2
    • Derive the updating rules of the models
    • Discuss on the testing methods for variational inference
    • Discuss on the consistent interface
  • Week 3, 4
    • Clean the API of GMM and fix bugs
    • Improve documents and examples for GMM
    • Test and Documentation
  • Week 5, 6, 7, 8
    • Debug and implement VBGMM
    • Test and Documentation
    • Pass the midterm
  • Week 9, 10
    • Debug and implement DPGMM
    • Test and Documentation
  • Week 11, 12
    • Add more test cases, improve documents and fix bugs
  • Week 13
    • Pencils down
    • Wrap up

References

  1. Issue 393 DPGMM and VBGMM clustering weights not updating
  2. Issue 1528 Consistency in GMM, _get_covars
  3. Issue 1637 dpgmm sample not working
  4. Issue 1764 DPGMM - _update_concentrations fail implementation
  5. Issue 2473 Bug: GMM score() returns an array, not a value
  6. Issue 2454 Scaling kills DPGMM
  7. Issue 3813 log-responsibilities in GMM
  8. Issue 4062 KernelDensity and GMM interfaces are unnecessarily confusing
  9. Issue 4202 GMM and score_samples(X) back to probabilities
  10. Issue 4267 Density doesn't normalize in VBGMM and DPGMM
  11. Issue 4429 incorrect estimated means lead to non positive definite covariance in GMM
  12. VBGMM DPGMM derivation in scikit-learn
  13. Blei, David M., and Michael I. Jordan. "Variational inference for Dirichlet process mixtures." Bayesian analysis 1.1 (2006): 121-143.
  14. Bishop, Christopher M. Pattern Recognition and Machine Learning. New York: Springer, 2006.
  15. Murphy, Kevin P. Machine learning: a probabilistic perspective. MIT press, 2012.
  16. Hastie, Trevor, et al. The Elements of Statistical Learning. 2nd ed. New York: Springer, 2009.
  17. mclust. An R package for normal mixture modeling via EM, model-based clustering, classification, and density estimation.
  18. Matlab gmdistribution
  19. BayesPy (A Python implementation)
  20. Variational Dirichlet Process Gaussian Mixture Model (A Matlab implementation)
  21. Wikipedia Mixture Model
  22. PR 116 Variational infinite gmm (The initial PR of VBGMM and DPGMM)
  23. NIPS 2014 Workshop
