Theory for the Jackknife
This chapter presents theory for the jackknife in the case where the data are i.i.d. Many of the results can be extended in a straightforward manner to more complicated settings, which will be studied in later chapters. We begin by focusing on jackknife variance estimators. The basic theoretical consideration in using jackknife variance estimators is their consistency, which is especially crucial when these estimators are used in large sample statistical inference problems such as constructing confidence sets for unknown parameters. A complete theory for the consistency of the jackknife variance estimators is given in Sections 2.1 and 2.2. We show that the success of the jackknife variance estimator for a given statistic T_n relies on the smoothness of T_n, which can be characterized by the differentiability of the function that generates T_n. The jackknife variance estimator may be inconsistent for a statistic that is not very smooth; however, its inconsistency can be rectified by using the delete-d jackknife, an extended version of the jackknife that removes more than one datum at a time (Section 2.3). The delete-d jackknife also provides a jackknife estimator of the sampling distribution of T_n, known as the jackknife histogram. Other applications of the jackknife, such as bias estimation and bias reduction, are discussed in Section 2.4. Some empirical results are given as examples.
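To fix ideas, the delete-1 jackknife variance estimator described above can be sketched as follows. This is a minimal illustration, not material from the chapter itself: the function name `jackknife_variance` and the use of NumPy are choices made here for the example. The estimator computes the leave-one-out replicates T_(i) and combines their spread with the factor (n-1)/n.

```python
import numpy as np

def jackknife_variance(data, statistic):
    """Delete-1 jackknife estimate of the variance of statistic(data).

    data: 1-D array of i.i.d. observations
    statistic: function mapping a sample to a scalar (e.g., np.mean)
    """
    n = len(data)
    # Leave-one-out replicates T_(i), i = 1, ..., n
    replicates = np.array([statistic(np.delete(data, i)) for i in range(n)])
    # v_jack = (n-1)/n * sum_i (T_(i) - mean of replicates)^2
    return (n - 1) / n * np.sum((replicates - replicates.mean()) ** 2)

rng = np.random.default_rng(0)
x = rng.normal(size=200)
v = jackknife_variance(x, np.mean)
```

For the sample mean, a short calculation shows that this estimator reduces exactly to s^2/n, the usual estimate of the variance of the mean; for less smooth statistics (e.g., sample quantiles), its behavior is precisely what the consistency theory of Sections 2.1-2.3 addresses.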
Keywords: Variance Estimation; Variance Estimator; Simple Random Sample; Asymptotic Variance; Influence Function