Adaptive Importance Sampling: The Past, the Present, and the Future

authors

  • BUGALLO, MÓNICA F.
  • ELVIRA ARREGUI, VICTOR
  • MARTINO, LUCA
  • LUENGO GARCIA, DAVID
  • MIGUEZ ARENAS, JOAQUIN
  • DJURIC, PETAR M

publication date

  • July 2017

start page

  • 60

end page

  • 79

issue

  • 4

volume

  • 34

International Standard Serial Number (ISSN)

  • 1053-5888

Electronic International Standard Serial Number (EISSN)

  • 1558-0792

abstract

  • A fundamental problem in signal processing is the estimation of unknown parameters or functions from noisy observations. Important examples include localization of objects in wireless sensor networks [1] and the Internet of Things [2]; multiple source reconstruction from electroencephalograms [3]; estimation of power spectral density for speech enhancement [4]; and inference in genomic signal processing [5]. Within the Bayesian signal processing framework, these problems are addressed by constructing posterior probability distributions of the unknowns. The posteriors optimally combine all of the information about the unknowns contained in the observations with the information present in their prior probability distributions. Given the posterior, one often wants to make inferences about the unknowns, e.g., when estimating parameters, finding the values that maximize their posterior or the values that minimize some cost function given the uncertainty of the parameters. Unfortunately, obtaining closed-form solutions to these types of problems is infeasible in most practical applications, and therefore developing approximate inference techniques is of utmost interest.
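
  The article surveys adaptive importance sampling (AIS) as one such family of approximate inference techniques. As a purely illustrative aid, the sketch below implements a generic AIS loop in Python on a 1-D Gaussian toy posterior; the moment-matching adaptation rule and all names (log_target, adaptive_importance_sampling) are assumptions made for this sketch and do not correspond to any specific algorithm reviewed in the article.

    # Minimal adaptive importance sampling (AIS) sketch on a 1-D Gaussian toy
    # posterior. The moment-matching update is an illustrative assumption,
    # not a specific scheme from the article.
    import numpy as np

    rng = np.random.default_rng(0)

    def log_target(x):
        # Unnormalized log-posterior of the toy model: Gaussian, mean 3.0, std 0.5.
        return -0.5 * ((x - 3.0) / 0.5) ** 2

    def adaptive_importance_sampling(n_iters=20, n_samples=500):
        mu, sigma = 0.0, 5.0  # deliberately poor initial proposal
        for _ in range(n_iters):
            x = rng.normal(mu, sigma, n_samples)                    # draw from the proposal
            log_q = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)  # log-proposal (up to a constant)
            log_w = log_target(x) - log_q                           # log importance weights
            w = np.exp(log_w - log_w.max())
            w /= w.sum()                                            # self-normalized weights
            # Adapt the proposal by matching its moments to the weighted sample.
            mu = np.sum(w * x)
            sigma = np.sqrt(np.sum(w * (x - mu) ** 2)) + 1e-6
        return mu, sigma

    if __name__ == "__main__":
        mu_hat, sigma_hat = adaptive_importance_sampling()
        print(f"Adapted proposal: mean={mu_hat:.3f}, std={sigma_hat:.3f}")  # approx. (3.0, 0.5)

  In a typical run the proposal parameters converge toward the toy posterior's mean and standard deviation (approximately 3.0 and 0.5). The self-normalized weights keep the procedure valid when the target is only known up to a normalizing constant, which is the usual situation for Bayesian posteriors.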

subjects

  • Telecommunications

keywords

  • sampling methods; artificial intelligence; Bayes methods; signal processing algorithms; probability distribution; Monte Carlo methods; signal processing