On the performance of particle filters with adaptive number of particles

publication date

  • October 2021

start page

  • 1

end page

  • 18

issue

  • 6

volume

  • 31

International Standard Serial Number (ISSN)

  • 0960-3174

Electronic International Standard Serial Number (EISSN)

  • 1573-1375

abstract

  • We investigate the performance of a class of particle filters (PFs) that can automatically tune their computational complexity
    by evaluating online certain predictive statistics which are invariant for a broad class of state-space models. To be specific, we
    propose a family of block-adaptive PFs based on the methodology of Elvira et al. (IEEE Trans Signal Process 65(7):1781–
    1794, 2017). In this class of algorithms, the number of Monte Carlo samples (known as particles) is adjusted periodically, and
    we prove that the theoretical error bounds of the PF actually adapt to the updates in the number of particles. The evaluation
    of the predictive statistics that lies at the core of the methodology is done by generating fictitious observations, i.e., particles
    in the observation space. We study, both analytically and numerically, the impact of the number K of these particles on the
    performance of the algorithm. In particular, we prove that if the predictive statistics with K fictitious observations converged
    exactly, then the particle approximation of the filtering distribution would match the first K elements in a series of moments
    of the true filter. This result can be understood as a converse to some convergence theorems for PFs. From this analysis, we
    deduce an alternative predictive statistic that can be computed (for some models) without sampling any fictitious observations
    at all. Finally, we conduct an extensive simulation study that illustrates the theoretical results and provides further insights
    into the complexity, performance and behavior of the new class of algorithms.
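
As a rough illustration of the kind of algorithm described in the abstract, the sketch below implements a bootstrap particle filter on a toy scalar linear-Gaussian model and adjusts the number of particles once per block of observations, using rank statistics of the real observation among K fictitious observations drawn in the observation space. The toy model, the block length, the chi-square-like uniformity score and its thresholds, and the doubling/halving rule are all illustrative assumptions, not the specific block-adaptive rule analysed by Elvira et al.; the sketch only conveys the overall structure (propagate, generate fictitious observations, weight, resample, adapt the particle count per block).

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy scalar state-space model (assumed for illustration only):
    #   x_t = 0.9 * x_{t-1} + process noise,   y_t = x_t + observation noise.
    A, SIG_X, SIG_Y = 0.9, 1.0, 0.5


    def simulate(T):
        """Generate a synthetic trajectory from the toy model."""
        x = np.zeros(T)
        x[0] = rng.normal(0.0, SIG_X)
        for t in range(1, T):
            x[t] = A * x[t - 1] + rng.normal(0.0, SIG_X)
        y = x + rng.normal(0.0, SIG_Y, size=T)
        return x, y


    def block_adaptive_pf(y, n_init=200, n_min=50, n_max=5000, K=7, block=20):
        """Bootstrap PF whose particle count is updated once per block of
        observations, using rank statistics of the true observation among
        K fictitious observations (a simplified stand-in for the
        block-adaptive rule; the thresholds below are illustrative)."""
        T = len(y)
        N = n_init
        particles = rng.normal(0.0, SIG_X, size=N)
        est = np.zeros(T)
        ranks = []                      # rank statistics collected over the block
        n_history = []

        for t in range(T):
            # Propagate particles through the transition kernel.
            particles = A * particles + rng.normal(0.0, SIG_X, size=N)

            # Draw K fictitious observations from the predictive distribution of
            # y_t: pick K particles at random, push them through the likelihood.
            idx = rng.integers(0, N, size=K)
            y_fict = particles[idx] + rng.normal(0.0, SIG_Y, size=K)
            ranks.append(int(np.sum(y_fict < y[t])))   # value in {0, ..., K}

            # Weight and resample (standard bootstrap step).
            logw = -0.5 * ((y[t] - particles) / SIG_Y) ** 2
            w = np.exp(logw - logw.max())
            w /= w.sum()
            est[t] = np.dot(w, particles)
            particles = rng.choice(particles, size=N, p=w)

            # End of a block: check the rank statistics for uniformity and
            # grow/shrink N accordingly (simple chi-square-like score).
            if (t + 1) % block == 0:
                counts = np.bincount(ranks, minlength=K + 1)
                expected = len(ranks) / (K + 1)
                score = np.sum((counts - expected) ** 2) / expected
                if score > 2.0 * K:       # poor fit -> more particles
                    N = min(2 * N, n_max)
                elif score < 0.5 * K:     # comfortable fit -> fewer particles
                    N = max(N // 2, n_min)
                particles = rng.choice(particles, size=N)  # resize particle set
                ranks = []
            n_history.append(N)

        return est, np.array(n_history)


    if __name__ == "__main__":
        x_true, y_obs = simulate(400)
        x_est, n_hist = block_adaptive_pf(y_obs)
        print("RMSE:", np.sqrt(np.mean((x_true - x_est) ** 2)))
        print("particle counts used:", sorted(set(n_hist)))

In an actual implementation the adaptation decision should be driven by the predictive statistics whose convergence properties are established in the paper; the uniformity score used here is only a placeholder for that test.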

subjects

  • Industrial Engineering

keywords

  • particle filtering; sequential Monte Carlo; predictive distributions; convergence analysis; adaptive complexity