
The Jackknife and Bootstrap by Jun Shao


$567.79
Condition - New
Only 2 left

Summary


The jackknife and bootstrap are the most popular data-resampling methods used in statistical analysis. The resampling methods replace theoretical derivations required in applying traditional methods (such as substitution and linearization) in statistical analysis by repeatedly resampling the original data and making inferences from the resamples. Because of the availability of inexpensive and fast computing, these computer-intensive methods have caught on very rapidly in recent years and are particularly appreciated by applied statisticians.

The primary aims of this book are (1) to provide a systematic introduction to the theory of the jackknife, the bootstrap, and other resampling methods developed in the last twenty years; (2) to provide a guide for applied statisticians: practitioners often use (or misuse) the resampling methods in situations where no theoretical confirmation has been made; and (3) to stimulate the use of the jackknife and bootstrap and further developments of the resampling methods.

The theoretical properties of the jackknife and bootstrap methods are studied in this book in an asymptotic framework. Theorems are illustrated by examples. Finite sample properties of the jackknife and bootstrap are mostly investigated by examples and/or empirical simulation studies. In addition to the theory for the jackknife and bootstrap methods in problems with independent and identically distributed (i.i.d.) data, we try to cover, as much as we can, the applications of the jackknife and bootstrap in various complicated non-i.i.d. data problems.
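The summary describes the mechanics of both methods: the bootstrap recomputes a statistic on samples drawn with replacement from the original data, while the delete-1 jackknife recomputes it with each observation left out in turn. The following is a minimal illustrative sketch in Python with NumPy, not taken from the book; the function names, the default number of resamples, and the choice of the sample mean as the statistic are assumptions made for this example.

```python
import numpy as np

def bootstrap_variance(data, statistic, n_boot=2000, seed=0):
    """Bootstrap variance estimate: the sample variance of the
    statistic recomputed on n_boot resamples drawn with replacement."""
    rng = np.random.default_rng(seed)
    n = len(data)
    replicates = np.array([
        statistic(rng.choice(data, size=n, replace=True))
        for _ in range(n_boot)
    ])
    return replicates.var(ddof=1)

def jackknife_variance(data, statistic):
    """Delete-1 jackknife variance estimate:
    v = (n-1)/n * sum_i (theta_(i) - theta_bar)^2,
    where theta_(i) is the statistic with observation i removed."""
    n = len(data)
    leave_one_out = np.array([statistic(np.delete(data, i)) for i in range(n)])
    return (n - 1) / n * np.sum((leave_one_out - leave_one_out.mean()) ** 2)

if __name__ == "__main__":
    data = np.random.default_rng(42).exponential(size=50)
    # For the sample mean, the jackknife reproduces s^2/n exactly,
    # and the bootstrap estimate should come out close to it.
    print("jackknife:", jackknife_variance(data, np.mean))
    print("bootstrap:", bootstrap_variance(data, np.mean))
    print("s^2/n    :", data.var(ddof=1) / len(data))
```

For smooth statistics such as functions of means (the setting of Section 2.1) both estimators behave well; for non-smooth statistics such as sample quantiles the delete-1 jackknife is known to be inconsistent, which is one motivation for the delete-d jackknife covered in Section 2.3.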

Table of Contents

1. Introduction. 1.1 Statistics and Their Sampling Distributions. 1.2 The Traditional Approach. 1.3 The Jackknife. 1.4 The Bootstrap. 1.5 Extensions to Complex Problems. 1.6 Scope of Our Studies.

2. Theory for the Jackknife. 2.1 Variance Estimation for Functions of Means. 2.1.1 Consistency. 2.1.2 Other properties. 2.1.3 Discussions and examples. 2.2 Variance Estimation for Functionals. 2.2.1 Differentiability and consistency. 2.2.2 Examples. 2.2.3 Convergence rate. 2.2.4 Other differential approaches. 2.3 The Delete-d Jackknife. 2.3.1 Variance estimation. 2.3.2 Jackknife histograms. 2.4 Other Applications. 2.4.1 Bias estimation. 2.4.2 Bias reduction. 2.4.3 Miscellaneous results. 2.5 Conclusions and Discussions.

3. Theory for the Bootstrap. 3.1 Techniques in Proving Consistency. 3.1.1 Bootstrap distribution estimators. 3.1.2 Mallows' distance. 3.1.3 Berry-Esseen's inequality. 3.1.4 Imitation. 3.1.5 Linearization. 3.1.6 Convergence in moments. 3.2 Consistency: Some Major Results. 3.2.1 Distribution estimators. 3.2.2 Variance estimators. 3.3 Accuracy and Asymptotic Comparisons. 3.3.1 Convergence rate. 3.3.2 Asymptotic minimaxity. 3.3.3 Asymptotic mean squared error. 3.3.4 Asymptotic relative error. 3.3.5 Conclusions. 3.4 Fixed Sample Performance. 3.4.1 Moment estimators. 3.4.2 Distribution estimators. 3.4.3 Conclusions. 3.5 Smoothed Bootstrap. 3.5.1 Empirical evidences and examples. 3.5.2 Sample quantiles. 3.5.3 Remarks. 3.6 Nonregular Cases. 3.7 Conclusions and Discussions.

4. Bootstrap Confidence Sets and Hypothesis Tests. 4.1 Bootstrap Confidence Sets. 4.1.1 The bootstrap-t. 4.1.2 The bootstrap percentile. 4.1.3 The bootstrap bias-corrected percentile. 4.1.4 The bootstrap accelerated bias-corrected percentile. 4.1.5 The hybrid bootstrap. 4.2 Asymptotic Theory. 4.2.1 Consistency. 4.2.2 Accuracy. 4.2.3 Other asymptotic comparisons. 4.3 The Iterative Bootstrap and Other Methods. 4.3.1 The iterative bootstrap. 4.3.2 Bootstrap calibrating. 4.3.3 The automatic percentile and variance stabilizing. 4.3.4 Fixed width bootstrap confidence intervals. 4.3.5 Likelihood based bootstrap confidence sets. 4.4 Empirical Comparisons. 4.4.1 The bootstrap-t, percentile, BC, and BCa. 4.4.2 The bootstrap and other asymptotic methods. 4.4.3 The iterative bootstrap and bootstrap calibration. 4.4.4 Summary. 4.5 Bootstrap Hypothesis Tests. 4.5.1 General description. 4.5.2 Two-sided hypotheses with nuisance parameters. 4.5.3 Bootstrap distance tests. 4.5.4 Other results and discussions. 4.6 Conclusions and Discussions.

5. Computational Methods. 5.1 The Delete-1 Jackknife. 5.1.1 The one-step jackknife. 5.1.2 Grouping and random subsampling. 5.2 The Delete-d Jackknife. 5.2.1 Balanced subsampling. 5.2.2 Random subsampling. 5.3 Analytic Approaches for the Bootstrap. 5.3.1 The delta method. 5.3.2 Jackknife approximations. 5.3.3 Saddle point approximations. 5.3.4 Remarks. 5.4 Simulation Approaches for the Bootstrap. 5.4.1 The simple Monte Carlo method. 5.4.2 Balanced bootstrap resampling. 5.4.3 Centering after Monte Carlo. 5.4.4 The linear bootstrap. 5.4.5 Antithetic bootstrap resampling. 5.4.6 Importance bootstrap resampling. 5.4.7 The one-step bootstrap. 5.5 Conclusions and Discussions.

6. Applications to Sample Surveys. 6.1 Sampling Designs and Estimates. 6.2 Resampling Methods. 6.2.1 The jackknife. 6.2.2 The balanced repeated replication. 6.2.3 Approximated BRR methods. 6.2.4 The bootstrap. 6.3 Comparisons by Simulation. 6.4 Asymptotic Results. 6.4.1 Assumptions. 6.4.2 The jackknife and BRR for functions of averages. 6.4.3 The RGBRR and RSBRR for functions of averages. 6.4.4 The bootstrap for functions of averages. 6.4.5 The BRR and bootstrap for sample quantiles. 6.5 Resampling Under Imputation. 6.5.1 Hot deck imputation. 6.5.2 An adjusted jackknife. 6.5.3 Multiple bootstrap hot deck imputation. 6.5.4 Bootstrapping under imputation. 6.6 Conclusions and Discussions.

7. Applications to Linear Models. 7.1 Linear Models and Regression Estimates. 7.2 Variance and Bias Estimation. 7.2.1 Weighted and unweighted jackknives. 7.2.2 Three types of bootstraps. 7.2.3 Robustness and efficiency. 7.3 Inference and Prediction Using the Bootstrap. 7.3.1 Confidence sets. 7.3.2 Simultaneous confidence intervals. 7.3.3 Hypothesis tests. 7.3.4 Prediction. 7.4 Model Selection. 7.4.1 Cross-validation. 7.4.2 The bootstrap. 7.5 Asymptotic Theory. 7.5.1 Variance estimators. 7.5.2 Bias estimators. 7.5.3 Bootstrap distribution estimators. 7.5.4 Inference and prediction. 7.5.5 Model selection. 7.6 Conclusions and Discussions.

8. Applications to Nonlinear, Nonparametric, and Multivariate Models. 8.1 Nonlinear Regression. 8.1.1 Jackknife variance estimators. 8.1.2 Bootstrap distributions and confidence sets. 8.1.3 Cross-validation for model selection. 8.2 Generalized Linear Models. 8.2.1 Jackknife variance estimators. 8.2.2 Bootstrap procedures. 8.2.3 Model selection by bootstrapping. 8.3 Cox's Regression Models. 8.3.1 Jackknife variance estimators. 8.3.2 Bootstrap procedures. 8.4 Kernel Density Estimation. 8.4.1 Bandwidth selection by cross-validation. 8.4.2 Bandwidth selection by bootstrapping. 8.4.3 Bootstrap confidence sets. 8.5 Nonparametric Regression. 8.5.1 Kernel estimates for fixed design. 8.5.2 Kernel estimates for random regressor. 8.5.3 Nearest neighbor estimates. 8.5.4 Smoothing splines. 8.6 Multivariate Analysis. 8.6.1 Analysis of covariance matrix. 8.6.2 Multivariate linear models. 8.6.3 Discriminant analysis. 8.6.4 Factor analysis and clustering. 8.7 Conclusions and Discussions.

9. Applications to Time Series and Other Dependent Data. 9.1 m-Dependent Data. 9.2 Markov Chains. 9.3 Autoregressive Time Series. 9.3.1 Bootstrapping residuals. 9.3.2 Model selection. 9.4 Other Time Series. 9.4.1 ARMA(p,q) models. 9.4.2 Linear regression with time series errors. 9.4.3 Dynamical linear regression. 9.5 Stationary Processes. 9.5.1 Moving block and circular block. 9.5.2 Consistency of the bootstrap. 9.5.3 Accuracy of the bootstrap. 9.5.4 Remarks. 9.6 Conclusions and Discussions.

10. Bayesian Bootstrap and Random Weighting. 10.1 Bayesian Bootstrap. 10.1.1 Bayesian bootstrap with a noninformative prior. 10.1.2 Bayesian bootstrap using prior information. 10.1.3 The weighted likelihood bootstrap. 10.1.4 Some remarks. 10.2 Random Weighting. 10.2.1 Motivation. 10.2.2 Consistency. 10.2.3 Asymptotic accuracy. 10.3 Random Weighting for Functional and Linear Models. 10.3.1 Statistical functionals. 10.3.2 Linear models. 10.4 Empirical Results for Random Weighting. 10.5 Conclusions and Discussions.

Appendix A. Asymptotic Results. A.1 Modes of Convergence. A.2 Convergence of Transformations. A.4 The Borel-Cantelli Lemma. A.5 The Law of Large Numbers. A.6 The Law of the Iterated Logarithm. A.7 Uniform Integrability. A.8 The Central Limit Theorem. A.9 The Berry-Esseen Theorem. A.10 Edgeworth Expansions. A.11 Cornish-Fisher Expansions.

Appendix B. Notation.

References. Author Index.

Additional information

SKU: NLS9781461269038
ISBN-13: 9781461269038
ISBN-10: 1461269032
Title: The Jackknife and Bootstrap by Jun Shao
Condition: New
Binding: Paperback
Publisher: Springer-Verlag New York Inc.
Publication date: 2012-10-04
Number of pages: 517
Book picture is for illustrative purposes only; the actual binding, cover or edition may vary.
This is a new book; be the first to read this copy. With untouched pages and a perfect binding, your brand new copy is ready to be opened for the first time.

Customer Reviews - The Jackknife and Bootstrap