Electronic International Standard Serial Number (EISSN)
1557-9654
abstract
As shown by Médard, the capacity of fading channels with imperfect channel-state information can be lower-bounded by assuming a Gaussian channel input X with power P and by upper-bounding the conditional entropy h(X|Y, Ĥ) by the entropy of a Gaussian random variable with variance equal to the linear minimum mean-square error in estimating X from (Y, Ĥ). We demonstrate that, using a rate-splitting approach, this lower bound can be sharpened: by expressing the Gaussian input X as the sum of two independent Gaussian variables X₁ and X₂, by applying Médard's lower bound first to bound the mutual information between X₁ and Y while treating X₂ as noise, and by applying it a second time to the mutual information between X₂ and Y while assuming X₁ to be known, we obtain a capacity lower bound that is strictly larger than Médard's lower bound. We then generalize this approach to an arbitrary number L of layers, where X is expressed as the sum of L independent Gaussian random variables of respective variances Pℓ, ℓ = 1, …, L, summing up to P. Among all such rate-splitting bounds, we determine the supremum over power allocations Pℓ and total number of layers L. This supremum is achieved for L → ∞ and gives rise to an analytically expressible capacity lower bound. For Gaussian fading, this novel bound is shown to converge to the Gaussian-input mutual information as the signal-to-noise ratio (SNR) grows, provided that the variance of the channel estimation error H − Ĥ tends to zero as the SNR tends to infinity.
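To give a concrete sense of the layered bound, the sketch below evaluates an L-layer rate-splitting lower bound by Monte Carlo under one simple illustrative model: Rayleigh fading H ~ CN(0, 1), an LMMSE-type estimate Ĥ with independent estimation error of variance ε², and equal power allocation Pℓ = P/L. The model parameters, the helper name rate_splitting_bound, and the equal-power choice are assumptions made for illustration; they are not the paper's general setup or its optimal allocation. For layer ℓ, layers 1, …, ℓ−1 are treated as known (their residual self-interference through the estimation error remains) and layers ℓ+1, …, L as noise, and Médard's bound is applied to each layer in turn; with L = 1 the expression reduces to Médard's original bound E[log(1 + |Ĥ|²P / (ε²P + σ²))].

```python
import numpy as np

rng = np.random.default_rng(0)

def rate_splitting_bound(P, eps2, sigma2, L, n=200_000, rng=rng):
    """Monte Carlo estimate of an L-layer rate-splitting lower bound
    (equal power allocation P_l = P/L), in nats per channel use.

    Illustrative model (an assumption, not the paper's general setup):
      Y = H X + Z,  H ~ CN(0, 1),  Z ~ CN(0, sigma2),
      estimate Hhat with error H - Hhat ~ CN(0, eps2),
      Hhat ~ CN(0, 1 - eps2) independent of the error.
    """
    # Draw the channel estimate; only |Hhat|^2 enters the bound.
    hhat = np.sqrt((1 - eps2) / 2) * (rng.standard_normal(n)
                                      + 1j * rng.standard_normal(n))
    g = np.abs(hhat) ** 2
    Pl = P / L
    rate = 0.0
    for l in range(1, L + 1):
        past = (l - 1) * Pl    # power of already-decoded layers
        future = (L - l) * Pl  # power of layers still treated as noise
        # |s|^2 for the sum s of known Gaussian layers is exponential
        # with mean `past`; the estimation error H - Hhat multiplies s,
        # so known layers still leave residual noise of variance eps2*|s|^2.
        s2 = past * rng.exponential(size=n)
        denom = g * future + eps2 * (s2 + Pl + future) + sigma2
        rate += np.mean(np.log1p(g * Pl / denom))
    return rate

P, eps2, sigma2 = 10.0, 0.05, 1.0
for L in (1, 2, 4, 16, 64):
    print(f"L = {L:3d}:  {rate_splitting_bound(P, eps2, sigma2, L):.4f} nats/use")
```

In this sketch the gain over L = 1 comes from the mechanism the abstract describes: once earlier layers are known, the effective noise seen by a later layer depends on the realization |s|², and averaging the logarithm over |s|² exceeds plugging in its mean (Jensen's inequality), which is where the strict improvement over Médard's bound enters. The paper's supremum is taken over all power allocations and L → ∞; equal allocation here is merely the simplest choice for a numerical check.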
keywords
channel capacity; fading channels; flat fading; imperfect channel-state information