Stein’s Method for Practical Machine Learning

Stein's method, due to Charles M. Stein, is a set of remarkably powerful theoretical techniques for proving approximation and limit theorems in probability theory. It has long been known mostly within theoretical statistics. Recently, however, it has been shown that some of the key ideas behind Stein's method can be naturally adapted to solve computational and statistical challenges in practical machine learning. This project aims to harness Stein's method for practical purposes, with a focus on developing new and efficient algorithms for learning, inference, and model evaluation of highly complex probabilistic graphical models and deep learning models.

Kernelized Stein Discrepancy

Kernelized Stein discrepancy (KSD), which combines the classical Stein discrepancy with reproducing kernel Hilbert spaces (RKHS), allows us to assess the compatibility between empirical data and a probability distribution, and provides a powerful tool for developing algorithms for model evaluation (goodness-of-fit testing), as well as for learning and inference in general. Unlike traditional divergence measures (such as the KL and chi-square divergences), KSD does not require evaluating the normalization constant of the distribution, and can therefore be applied even to the intractable, unnormalized distributions widely used in modern machine learning.

[See more details here>>].
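To make the idea concrete, here is a minimal sketch (in Python/NumPy, under illustrative assumptions not taken from this page) of how a U-statistic estimate of KSD can be computed from samples using only the score function grad_x log p(x), so the normalization constant is never needed. The RBF kernel, fixed bandwidth h, standard Gaussian target, and the names ksd_ustat and score_gauss are all illustrative choices, not the exact implementation released with the papers below.

import numpy as np

def ksd_ustat(x, score_p, h=1.0):
    # U-statistic estimate of KSD^2 for samples x of shape (n, d),
    # given the score function score_p(x) = grad_x log p(x).
    n, d = x.shape
    s = score_p(x)                                 # score at each sample, (n, d)
    diff = x[:, None, :] - x[None, :, :]           # pairwise differences, (n, n, d)
    sqdist = np.sum(diff ** 2, axis=-1)            # squared distances, (n, n)
    k = np.exp(-sqdist / (2 * h ** 2))             # RBF kernel matrix

    # Stein kernel u_p(x, x'), assembled term by term for the RBF kernel.
    term1 = (s @ s.T) * k                                   # s(x)^T k(x, x') s(x')
    term2 = np.einsum('id,ijd->ij', s, diff) * k / h ** 2   # s(x)^T grad_{x'} k
    term3 = -np.einsum('jd,ijd->ij', s, diff) * k / h ** 2  # grad_x k^T s(x')
    term4 = (d / h ** 2 - sqdist / h ** 4) * k               # trace(grad_x grad_{x'} k)
    u = term1 + term2 + term3 + term4

    np.fill_diagonal(u, 0.0)                       # U-statistic drops the diagonal
    return u.sum() / (n * (n - 1))

# Samples drawn from the target give a KSD near zero; shifted samples do not.
rng = np.random.default_rng(0)
score_gauss = lambda x: -x                         # score of a standard Gaussian target
print(ksd_ustat(rng.normal(size=(500, 2)), score_gauss))
print(ksd_ustat(rng.normal(size=(500, 2)) + 2.0, score_gauss))

This is the basis of the goodness-of-fit test: the estimated discrepancy is small when the samples come from the model and grows as the model and data diverge.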

Stein Meets Variational Inference

By exploiting a close connection between Stein discrepancy and the KL divergence, we derive a new variational inference algorithm, called Stein variational gradient descent (SVGD), that combines the advantages of variational inference, Monte Carlo, quasi-Monte Carlo, and gradient descent (for MAP). SVGD provides a powerful new tool for tackling the inference and learning challenges in graphical models and probabilistic deep learning, especially when diverse outputs are needed to capture posterior uncertainty in the Bayesian framework.

[See more details here>>].
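As a rough illustration, the following Python/NumPy sketch shows one form of the SVGD particle update with an RBF kernel and the median bandwidth heuristic. The Gaussian target, step size, particle count, and the names svgd_step and score_gauss are illustrative assumptions, not the exact settings or code of the papers listed below.

import numpy as np

def svgd_step(x, score_p, stepsize=0.1):
    # One SVGD update on the particle set x of shape (n, d),
    # given the score function score_p(x) = grad_x log p(x).
    n, d = x.shape
    diff = x[:, None, :] - x[None, :, :]            # diff[i, j] = x_i - x_j, (n, n, d)
    sqdist = np.sum(diff ** 2, axis=-1)
    h2 = np.median(sqdist) / np.log(n + 1)          # median heuristic bandwidth
    k = np.exp(-sqdist / (2 * h2))                  # RBF kernel matrix

    grad_logp = score_p(x)                          # (n, d)
    # Driving term pulls particles toward high-probability regions of p;
    # the kernel-gradient term acts as a repulsive force that keeps them diverse.
    phi = (k @ grad_logp + np.einsum('ij,ijd->id', k, diff) / h2) / n
    return x + stepsize * phi

# Transport particles initialized at N(5, I) toward a standard Gaussian target.
rng = np.random.default_rng(0)
particles = rng.normal(loc=5.0, size=(200, 2))
score_gauss = lambda x: -x                          # score of a standard Gaussian target
for _ in range(500):
    particles = svgd_step(particles, score_gauss)
print(particles.mean(axis=0), particles.std(axis=0))  # approaches mean 0, std 1

The repulsive kernel-gradient term is what distinguishes SVGD from running parallel gradient ascent on log p: it prevents the particles from collapsing to a single mode and lets the ensemble approximate the full posterior.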

Slides

Probabilistic Learning and Inference Using Stein's Method [slides]

A Kernelized Stein Discrepancy for Goodness-of-fit Tests and Model Evaluation [ICML2016 slides]

Papers

Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm

Liu, Wang; NIPS, 2016. [code]

A Kernelized Stein Discrepancy for Goodness-of-fit Tests and Model Evaluation

Liu, Lee, Jordan; ICML, 2016. [code: Matlab, R]

Learning to Draw Samples: With Application to Amortized MLE for Generative Adversarial Learning

Wang, Liu; preprint, 2016. [code]

Black-box Importance Sampling

Liu, Lee; preprint, 2016.

Two Methods for Wild Variational Inference

Liu, Feng; preprint, 2016.

Informal Notes & Misc

A Short Note on Kernelized Stein Discrepancy

Liu, 2016

Stein Variational Gradient Descent: Theory and Applications

Liu, NIPS Workshop on Advances in Approximate Bayesian Inference, 2016

Learning to Sample Using Stein Discrepancy

Wang, Feng, Liu, NIPS Workshop on Bayesian Deep Learning, 2016