Information-Estimation Relationships Over Binomial and Negative Binomial Models

authors

  • Gil Taborda, Camilo
  • Guo, Dongning
  • Pérez Cruz, Fernando

publication date

  • May 2014

issue

  • 5

volume

  • 60

International Standard Serial Number (ISSN)

  • 0018-9448

Electronic International Standard Serial Number (EISSN)

  • 1557-9654

abstract

  • In recent years, a number of new connections between information measures and estimation have been found for various models, most notably the Gaussian and Poisson models. This paper develops similar results for the binomial and negative binomial models. In particular, it is shown that the derivatives of the relative entropy and of the mutual information for the binomial and negative binomial models can be expressed as the expectation of closed-form expressions whose main argument is a conditional estimate. Under mild conditions, these derivatives take the form of an expected Bregman divergence.
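
  For background only (these are standard identities, not the article's specific binomial and negative binomial formulas): the prototypical information-estimation relationship is the Gaussian I-MMSE identity of Guo, Shamai, and Verdú, and the Bregman divergence mentioned in the abstract is the usual one. A brief LaTeX sketch, assuming the standard definitions:

    % Background sketch; not taken from the article record.
    % Gaussian I-MMSE identity: the derivative of the mutual information
    % with respect to the signal-to-noise ratio equals half the MMSE.
    \[
      \frac{d}{d\,\mathrm{snr}}\, I\bigl(X;\ \sqrt{\mathrm{snr}}\,X + N\bigr)
        = \tfrac{1}{2}\,\mathbb{E}\!\left[\bigl(X - \mathbb{E}[X \mid Y]\bigr)^{2}\right],
      \qquad N \sim \mathcal{N}(0,1),\ \ Y = \sqrt{\mathrm{snr}}\,X + N.
    \]
    % Bregman divergence generated by a convex function F; an
    % "expected Bregman divergence" is E[D_F(X, \hat{X})], where \hat{X}
    % is a conditional estimate such as E[X | Y].
    \[
      D_F(p, q) = F(p) - F(q) - \nabla F(q)^{\top}(p - q).
    \]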

keywords

  • binomial model; Bregman divergence; mutual information; negative binomial model; relative entropy