Stein’s Method for Practical Machine Learning
Stein's method, due to Charles M. Stein, is a set of remarkably powerful theoretical techniques for proving approximation and limit theorems in probability theory. For decades it was known mostly among theoretical statisticians. Recently, however, it has been shown that key ideas from Stein's method can be naturally adapted to solve computational and statistical challenges in practical machine learning. This project aims to harness Stein's method for practical purposes, with a focus on developing new and efficient algorithms for learning, inference, and model evaluation of highly complex probabilistic graphical models and deep learning models.
Kernelized Stein Discrepancy
Kernelized Stein discrepancy (KSD), which combines the classical Stein discrepancy with reproducing kernel Hilbert spaces (RKHS), allows us to assess the compatibility between empirical data and probability distributions, and provides a powerful tool for developing algorithms for model evaluation (goodness-of-fit testing), as well as learning and inference in general. Unlike traditional divergence measures (such as the KL or Chi-square divergence), KSD does not require evaluating the normalization constant of the distribution, and can be applied even to the intractable, unnormalized distributions widely used in modern machine learning.
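As a concrete illustration, here is a minimal NumPy sketch of the KSD U-statistic for one-dimensional samples with an RBF kernel. The function name `ksd_ustat`, the fixed bandwidth, and the toy Gaussian example are illustrative choices, not code from this project. The key point matches the text: only the score function d/dx log p(x) is needed, so the normalization constant of p never appears.

```python
import numpy as np

def ksd_ustat(x, score, h=1.0):
    """U-statistic estimate of the squared KSD between 1-D samples x
    and a density p, given only through its score function d/dx log p."""
    n = x.shape[0]
    d = x[:, None] - x[None, :]              # pairwise differences x_i - x_j
    k = np.exp(-d**2 / (2 * h**2))           # RBF kernel matrix k(x_i, x_j)
    s = score(x)
    # Stein kernel u_p(x, y) for the RBF kernel (1-D case):
    u = (s[:, None] * s[None, :] * k         # s(x) s(y) k(x, y)
         + s[:, None] * (d / h**2) * k       # s(x) d/dy k(x, y)
         - s[None, :] * (d / h**2) * k       # s(y) d/dx k(x, y)
         + (1.0 / h**2 - d**2 / h**4) * k)   # d^2/(dx dy) k(x, y)
    np.fill_diagonal(u, 0.0)                 # U-statistic: drop diagonal terms
    return u.sum() / (n * (n - 1))

rng = np.random.default_rng(0)
score = lambda x: -x                         # score of the standard normal N(0, 1)
ksd_match = ksd_ustat(rng.normal(0.0, 1.0, 500), score)  # samples drawn from p itself
ksd_shift = ksd_ustat(rng.normal(1.5, 1.0, 500), score)  # samples from a shifted N(1.5, 1)
```

When the samples come from p itself, the estimate concentrates near zero; for the shifted samples it is clearly positive, which is exactly the signal a goodness-of-fit test thresholds on.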
Stein Meets Variational Inference
By exploiting an interesting connection between Stein discrepancy and KL divergence, we derive a new variational inference algorithm, called Stein variational gradient descent (SVGD), that combines the advantages of variational inference, Monte Carlo, quasi Monte Carlo, and gradient descent (for MAP). SVGD provides a powerful new tool for attacking the inference and learning challenges in graphical models and probabilistic deep learning, especially when diverse outputs are needed to capture posterior uncertainty in the Bayesian framework.
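The SVGD update moves a set of particles along a kernel-smoothed gradient of log p, plus a repulsive term coming from the kernel's gradient that keeps the particles diverse. Below is a minimal one-dimensional sketch using an RBF kernel with the median bandwidth heuristic; the function names and the toy N(2, 1) target are illustrative assumptions, not this project's released code.

```python
import numpy as np

def svgd_update(x, score):
    """One SVGD step direction for 1-D particles x, using an RBF kernel
    with the median bandwidth heuristic."""
    n = x.shape[0]
    diff = x[:, None] - x[None, :]              # diff[i, j] = x_i - x_j
    h2 = np.median(np.abs(diff))**2 / np.log(n + 1) + 1e-8   # bandwidth heuristic
    k = np.exp(-diff**2 / (2 * h2))             # RBF kernel matrix
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) score(x_j) + grad_{x_j} k(x_j, x_i) ]
    drift = k @ score(x)                        # smoothed gradient: pulls particles toward high density
    repulsion = (diff / h2 * k).sum(axis=1)     # kernel-gradient term: pushes particles apart
    return (drift + repulsion) / n

rng = np.random.default_rng(0)
x = rng.normal(-5.0, 1.0, 100)                  # particles start far from the target
score = lambda x: -(x - 2.0)                    # score of the target N(2, 1)
for _ in range(1000):
    x = x + 0.1 * svgd_update(x, score)
```

After the loop, the particle cloud approximates the target: its mean is near 2 and its spread near 1, rather than collapsing onto the MAP point, which is the "diverse outputs" behavior described above.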
Probabilistic Learning and Inference Using Stein's Method [slides]
A Kernelized Stein Discrepancy for Goodness-of-fit Tests and Model Evaluation [ICML2016 slides]
Informal Notes & Misc