In this paper we will investigate the consequences of applying the sieve bootstrap under regularity conditions that are sufficiently general to encompass both fractionally integrated and non-invertible processes. The sieve bootstrap is obtained by approximating the data generating process by an autoregression whose order h increases with the sample size T. The sieve bootstrap may be particularly useful in the analysis of fractionally integrated processes since the statistics of interest can often be non-pivotal with distributions that depend on the fractional index d. The validity of the sieve bootstrap is established and it is shown that when the sieve bootstrap is used to approximate the distribution of a general class of statistics admitting an Edgeworth expansion then the error rate achieved is of order O(T^{β+d-1}), for any β > 0. Practical implementation of the sieve bootstrap is considered and the results are illustrated using a canonical example.
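The procedure described above can be sketched in code: fit an autoregression of order h to the (demeaned) data, resample the centred residuals, and regenerate bootstrap series of the same length. This is a minimal illustrative sketch, not the authors' implementation; the least-squares AR fit, the function name, and all parameter names are assumptions (in practice Yule-Walker, Burg, or Levinson-Durbin fits are common alternatives, and h would be chosen to grow with T).

```python
import numpy as np

def sieve_bootstrap(x, h, n_boot=200, seed=None):
    """Hypothetical sieve bootstrap sketch: fit AR(h) by least squares,
    resample centred residuals, and simulate n_boot bootstrap series."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    T = len(x)
    xc = x - x.mean()

    # Fit AR(h) by ordinary least squares on lagged values.
    Y = xc[h:]
    X = np.column_stack([xc[h - k:T - k] for k in range(1, h + 1)])
    phi, *_ = np.linalg.lstsq(X, Y, rcond=None)

    # Centre the residuals before resampling.
    resid = Y - X @ phi
    resid = resid - resid.mean()

    series = np.empty((n_boot, T))
    for b in range(n_boot):
        # Draw i.i.d. innovations from the empirical residual distribution,
        # with a burn-in of h observations to start the recursion.
        eps = rng.choice(resid, size=T + h, replace=True)
        z = np.zeros(T + h)
        for t in range(h, T + h):
            z[t] = phi @ z[t - h:t][::-1] + eps[t]
        series[b] = z[h:] + x.mean()
    return series
```

A statistic of interest would then be recomputed on each row of the returned array to approximate its sampling distribution.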
Figure 1: Probability densities for d = 0.2, T = 40 (top panel) and T = 160 (bottom panel): Exact (Monte Carlo) (black), Edgeworth (blue), Approximate Edgeworth (cyan), Normal (green), Model Bootstrap (magenta), Sieve Bootstrap (ζT∗) (red).
Figure 2: Probability densities for d = 0.4, T = 40 (top panel) and T = 160 (bottom panel): Exact (Monte Carlo) (black), Edgeworth (blue), Approximate Edgeworth (cyan), Normal (green), Model Bootstrap (magenta), Sieve Bootstrap (ζT∗) (red).