Boosted Stochastic Backpropagation for Variational Inference

Report ID: TR-006-17
Authors:
Date: May 24, 2017
Pages: 53
Download Formats: [PDF]

Abstract:

Variational inference (VI) has risen in popularity with the advent of deep generative
models due to its efficient and scalable approximation of the posterior distribution.
However, VI is not generally guaranteed to capture the true posterior.
In this paper, we propose a mixture-based non-parametric variational inference
algorithm. We prove convergence to the true posterior at a rate of O(1/t), where t
is the number of mixture components.
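
As a rough illustration of the kind of update such a guarantee rests on (the
notation here is our own assumption, not taken from the report), each boosting
round adds one component to the mixture:

\[
q_t(z) = (1 - \alpha_t)\, q_{t-1}(z) + \alpha_t\, h_t(z), \qquad \alpha_t \in [0, 1],
\]

where h_t is the component learned at round t; with a greedy, Frank-Wolfe-style
choice of h_t and \alpha_t, analyses of this form typically yield
\( \mathrm{KL}(q_t \,\|\, p) \le C/t \) for a problem-dependent constant C.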
Using a mixture of Gaussians as the variational approximation, we propose
boosted stochastic backpropagation, deriving tractable approximations and
practical insights to avoid numerical instability when learning a new component
in the mini-batch setting.
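
A minimal sketch of the stochastic-backpropagation (reparameterization) step for
fitting a single Gaussian component, assuming PyTorch and a stand-in log-density
log_p (neither is specified by the report):

    import torch

    def log_p(z):
        # Stand-in unnormalized log target; in the report this would be the
        # model's log joint evaluated on a mini-batch.
        return -0.5 * (z ** 2).sum(-1)

    mu = torch.zeros(2, requires_grad=True)
    log_sigma = torch.zeros(2, requires_grad=True)  # log std, for numerical stability
    opt = torch.optim.Adam([mu, log_sigma], lr=1e-2)

    for step in range(1000):
        eps = torch.randn(64, 2)            # base noise
        z = mu + log_sigma.exp() * eps      # reparameterized sample, differentiable in (mu, sigma)
        log_q = torch.distributions.Normal(mu, log_sigma.exp()).log_prob(z).sum(-1)
        loss = (log_q - log_p(z)).mean()    # Monte Carlo estimate of KL(q || p), up to a constant
        opt.zero_grad()
        loss.backward()
        opt.step()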
We then use boosted stochastic backpropagation as an unsupervised boosting
meta-algorithm for non-parametric density estimation and apply it to Variational
Autoencoders.
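
The boosting meta-loop itself can be sketched as follows (our reading of the
abstract, not the report's code; fit_component is a hypothetical stub standing
in for the stochastic-backpropagation step above, and the step size is an
assumption):

    T = 5

    def fit_component(mixture):
        # Hypothetical stub: in the report this would run stochastic
        # backpropagation to fit a new Gaussian against the current mixture.
        return ("mu_t", "sigma_t")

    components, weights = [], []
    for t in range(1, T + 1):
        h_t = fit_component((components, weights))
        alpha = 2.0 / (t + 1)   # Frank-Wolfe-style step size (assumed)
        weights = [w * (1 - alpha) for w in weights] + [alpha]
        components.append(h_t)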
We empirically demonstrate the advantage of flexible and multimodal posterior
approximations in density estimation on MNIST.
