Subsampling inference for the autocovariances and autocorrelations of long-memory heavy-tailed linear time series
Overview
published in
- JOURNAL OF TIME SERIES ANALYSIS
publication date
- November 2012
start page
- 935
end page
- 953
volume
- 33
Digital Object Identifier (DOI)
International Standard Serial Number (ISSN)
- 0143-9782
Electronic International Standard Serial Number (EISSN)
- 1467-9892
abstract
- We provide a self-normalization for the sample autocovariances and autocorrelations of a linear, long-memory time series with innovations that either have finite fourth moment or are heavy-tailed with tail index 2 < α < 4. In the asymptotic distribution of the sample autocovariance there are three rates of convergence, depending on the interplay between the memory parameter d and α, which consequently lead to three different limit distributions; for the sample autocorrelation the limit distribution depends only on d. We introduce a self-normalized sample autocovariance statistic, which is computable without knowledge of α or d (or their relationship) and which converges to a non-degenerate distribution. We also treat self-normalization of the autocorrelations. Because the corresponding asymptotic distribution is still parameter-dependent, the sampling distributions can then be approximated non-parametrically by subsampling. The subsampling-based confidence intervals for the process autocovariances and autocorrelations are shown to have satisfactory empirical coverage rates in a simulation study. The impact of the subsampling block size on coverage is assessed. The methodology is further applied to the log-squared returns of Merck stock.
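The subsampling idea behind the confidence intervals can be illustrated with a minimal sketch: compute the sample autocovariance at a given lag on the full series, recompute it on overlapping blocks of length b, and use the empirical quantiles of the centred block statistics to form an interval. This is a generic subsampling sketch only; the paper's actual self-normalization (needed because the convergence rate depends on d and α) is not implemented here, and the function names and block-size choice are illustrative assumptions.

```python
import numpy as np

def sample_autocov(x, h):
    """Sample autocovariance at lag h (biased version, normalized by n)."""
    n = len(x)
    xbar = x.mean()
    return np.sum((x[:n - h] - xbar) * (x[h:] - xbar)) / n

def subsample_ci(x, h, b, level=0.95):
    """Naive subsampling confidence interval for the lag-h autocovariance.

    Illustrative only: uses overlapping blocks of length b and empirical
    quantiles of the centred block statistics. The rate correction /
    self-normalization discussed in the article is deliberately omitted.
    """
    n = len(x)
    full = sample_autocov(x, h)              # statistic on the full series
    blocks = np.array([sample_autocov(x[i:i + b], h)
                       for i in range(n - b + 1)])
    centred = blocks - full                  # centred block statistics
    alpha = 1.0 - level
    q_lo, q_hi = np.quantile(centred, [alpha / 2, 1 - alpha / 2])
    # Quantile-inversion interval: [full - q_hi, full - q_lo]
    return full - q_hi, full - q_lo

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)               # placeholder i.i.d. series
lo, hi = subsample_ci(x, h=1, b=100)
print(lo, hi)
```

In practice the block size b drives the quality of the approximation, which is why the article assesses coverage across block sizes in its simulation study.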