D. S. Poskitt (2006)
Types: Preprint
Subjects: Autoregressive approximation, fractional process, non-invertibility, rate of convergence, sieve bootstrap.
JEL: C22, C15

Classified by OpenAIRE into

arXiv: Statistics::Theory, Statistics::Methodology, Mathematics::Number Theory
In this paper we investigate the consequences of applying the sieve bootstrap under regularity conditions that are sufficiently general to encompass both fractionally integrated and non-invertible processes. The sieve bootstrap is obtained by approximating the data generating process by an autoregression whose order h increases with the sample size T. The sieve bootstrap may be particularly useful in the analysis of fractionally integrated processes, since the statistics of interest are often non-pivotal, with distributions that depend on the fractional index d. The validity of the sieve bootstrap is established, and it is shown that when the sieve bootstrap is used to approximate the distribution of a general class of statistics admitting an Edgeworth expansion, the error rate achieved is of order O(T^(β+d−1)) for any β > 0. Practical implementation of the sieve bootstrap is considered, and the results are illustrated using a canonical example.
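The sieve bootstrap procedure described in the abstract can be sketched in a few lines: fit an autoregression of order h to the data, resample the centred residuals with replacement, and rebuild bootstrap series recursively. This is only an illustrative sketch; the order rule h = ⌊10·log10(T)⌋, the least-squares fitting step, and the choice of statistic are assumptions for the example, not the paper's specific recommendations.

```python
import numpy as np

def fit_ar(x, h):
    """Least-squares fit of an AR(h) model; returns coefficients and residuals."""
    T = len(x)
    # Column j holds the lag-(j+1) regressor x_{t-j-1} for t = h, ..., T-1.
    X = np.column_stack([x[h - j - 1:T - j - 1] for j in range(h)])
    y = x[h:]
    phi, *_ = np.linalg.lstsq(X, y, rcond=None)
    return phi, y - X @ phi

def sieve_bootstrap(x, statistic, B=500, rng=None):
    """Sieve bootstrap: resample centred AR(h) residuals, rebuild the series,
    and recompute the statistic on each bootstrap replicate."""
    rng = np.random.default_rng(rng)
    T = len(x)
    h = int(10 * np.log10(T))            # illustrative rule: order grows with T
    phi, resid = fit_ar(x, h)
    resid = resid - resid.mean()         # centre residuals before resampling
    stats = np.empty(B)
    for b in range(B):
        eps = rng.choice(resid, size=T + h, replace=True)
        xb = np.zeros(T + h)             # first h values serve as burn-in
        for t in range(h, T + h):
            xb[t] = phi @ xb[t - h:t][::-1] + eps[t]   # recursive AR recursion
        stats[b] = statistic(xb[h:])
    return stats
```

The bootstrap distribution of, say, the sample mean is then `sieve_bootstrap(x, np.mean)`; its quantiles approximate the sampling distribution of the statistic without requiring knowledge of the fractional index d.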
Figure 1: Probability densities for d = 0.2, T = 40 (top panel) and T = 160 (bottom panel): Exact (Monte Carlo), Edgeworth, Approximate Edgeworth, Normal, Model Bootstrap, Sieve Bootstrap (ζ*_T).
Figure 2: Probability densities for d = 0.4, T = 40 (top panel) and T = 160 (bottom panel): Exact (Monte Carlo), Edgeworth, Approximate Edgeworth, Normal, Model Bootstrap, Sieve Bootstrap (ζ*_T).

Funded by projects

  • ARC | Fractional Integration, Pow...
