The Recycling Gibbs sampler for efficient learning



publication date

  • March 2018

start page

  • 1

end page

  • 13

volume

  • 74

International Standard Serial Number (ISSN)

  • 1051-2004

Electronic International Standard Serial Number (EISSN)

  • 1095-4333

abstract

  • Monte Carlo methods are essential tools for Bayesian inference. Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning, and statistics to draw samples from complicated high-dimensional posterior distributions. The key to a successful application of the Gibbs sampler is the ability to draw samples efficiently from the full conditional probability density functions. Since this is not possible in the general case, auxiliary samples must be generated in order to speed up the convergence of the chain, and their information is eventually disregarded. In this work, we show that these auxiliary samples can be recycled within the Gibbs estimators, improving their efficiency at no extra cost. This novel scheme arises naturally after pointing out the relationship between the standard Gibbs sampler and the chain rule used for sampling purposes. Numerical simulations involving simple and real inference problems confirm the excellent performance of the proposed scheme in terms of accuracy and computational efficiency. In particular, we give empirical evidence of its performance on a toy example, the inference of Gaussian process hyperparameters, and the learning of dependence graphs through regression. (C) 2017 Elsevier Inc. All rights reserved.
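The abstract describes the recycling idea only at a high level. The sketch below illustrates, under assumptions not taken from the paper, how auxiliary samples produced by a Metropolis-within-Gibbs update could be kept in the estimator instead of being discarded. The function names (recycling_gibbs, mh_within_gibbs), the toy bivariate Gaussian target, and all parameter choices are hypothetical, for illustration only.

```python
import numpy as np


def mh_within_gibbs(log_cond, x_init, n_aux, prop_std, rng):
    # Random-walk Metropolis-Hastings steps targeting one full conditional.
    # Returns the final state and ALL intermediate (auxiliary) samples.
    x = x_init
    aux = []
    for _ in range(n_aux):
        x_prop = x + prop_std * rng.standard_normal()
        if np.log(rng.uniform()) < log_cond(x_prop) - log_cond(x):
            x = x_prop
        aux.append(x)
    return x, aux


def recycling_gibbs(log_cond_pdfs, x0, n_iter, n_aux=5, prop_std=1.0, seed=0):
    # Gibbs sampler that keeps ("recycles") the auxiliary samples produced
    # while updating each coordinate, rather than using only the last one.
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    recycled = []  # every sample that enters the final estimator
    for _ in range(n_iter):
        for d, log_cond in enumerate(log_cond_pdfs):
            other = np.delete(x, d)  # remaining coordinates, held fixed
            cond = lambda v: log_cond(v, other)
            x[d], aux = mh_within_gibbs(cond, x[d], n_aux, prop_std, rng)
            for a in aux:
                y = x.copy()
                y[d] = a
                recycled.append(y)  # recycled: auxiliary value in slot d
    return np.array(recycled)


# Hypothetical toy target: bivariate Gaussian with correlation rho, whose
# full conditionals are N(rho * other, 1 - rho**2).
rho = 0.8
log_c = lambda v, other: -0.5 * (v - rho * other[0]) ** 2 / (1.0 - rho ** 2)
samples = recycling_gibbs([log_c, log_c], x0=[0.0, 0.0], n_iter=2000)
print("estimated mean:", samples.mean(axis=0))  # should be close to [0, 0]
```

In this reading, the only change with respect to a standard Metropolis-within-Gibbs scheme is that every auxiliary state visited while updating a coordinate is appended to the pool of samples used by the estimator, so the extra accuracy comes from draws that would otherwise be thrown away.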

keywords

  • Bayesian inference; Markov chain Monte Carlo (MCMC); Gibbs sampling; Gaussian processes (GP); automatic relevance determination (ARD)