Concentration of variational approximations of (tempered) posteriors

Speaker: Pierre Alquier, Professor, ENSAE ParisTech, Université Paris-Saclay, France.

Abstract

While Bayesian methods are extremely popular in statistics and machine learning, their application to massive datasets is often challenging: classical MCMC algorithms are prohibitively slow when both the model dimension and the sample size are large. Variational Bayesian (VB) methods aim at approximating the posterior by a distribution in a tractable family, so that MCMC sampling is replaced by an optimization algorithm that is orders of magnitude faster. VB methods have been applied in computationally demanding applications such as collaborative filtering, image and video processing, and natural language and text processing. However, despite strong empirical performance, the theoretical properties of these approximations are usually not known. In this talk I will present recent results on the rate of concentration of variational approximations of (tempered) posteriors. In particular, we provide conditions ensuring that the concentration rate is not deteriorated by the approximation. I will cover examples including logistic regression, matrix completion and mixture models.
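As a sketch of the setting (the notation below is illustrative and not taken from the talk): given a likelihood L_n(θ) and a prior π, the tempered posterior with temperature α ∈ (0,1] and its variational approximation over a tractable family F can be written as

\[
\pi_{n,\alpha}(\mathrm{d}\theta) \;\propto\; L_n(\theta)^{\alpha}\,\pi(\mathrm{d}\theta),
\qquad
\tilde{\pi}_{n,\alpha} \;=\; \operatorname*{arg\,min}_{Q \in \mathcal{F}} \; \mathrm{KL}\bigl(Q \,\|\, \pi_{n,\alpha}\bigr).
\]

The concentration question is then whether \(\tilde{\pi}_{n,\alpha}\) contracts around the true parameter at the same rate as \(\pi_{n,\alpha}\) itself.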

Joint work with co-authors from ENSAE (Nicolas Chopin, James Ridgway, Badr-Eddine Chérief-Abdellatif).

Published Aug. 13, 2018 12:44 PM - Last modified Oct. 9, 2018 9:17 AM