Abstract
We present a unified treatment of different types of one-step M-estimation in regression models that incorporates the Newton-Raphson, method-of-scoring, and iteratively reweighted least squares forms of one-step estimator. We use higher-order expansions to distinguish between the different forms of estimator and the effects of different initial estimators. We show that the Newton-Raphson form has better properties than the method-of-scoring form, which, in turn, has better properties than the iteratively reweighted least squares form. We also show that the best choice of initial estimator is a smooth, robust estimator that converges at the rate n^{-1/2}. These results have important consequences for the common data-analytic strategy of using a least squares analysis on "clean" data obtained by deleting observations with extreme residuals from an initial least squares fit. We show that the resulting estimator is an iteratively reweighted least squares one-step estimator with least squares as the initial estimator, giving it the worst performance of the one-step estimators we consider: inferences resulting from this strategy are neither valid nor robust.
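To make the distinction concrete, the following is a minimal sketch, not taken from the paper, of the three one-step update forms for a Huber-type M-estimator in a linear model. The Huber psi function, the MAD scale estimate, and all variable names are illustrative assumptions rather than the paper's notation.

```python
# Illustrative sketch only: three forms of one-step M-estimation for a
# linear model y = X @ beta + e with a Huber psi (an assumed choice).
import numpy as np

def psi_huber(u, c=1.345):
    # Huber psi: identity in the middle, clipped at +/- c
    return np.clip(u, -c, c)

def dpsi_huber(u, c=1.345):
    # Derivative of the Huber psi (0/1 indicator)
    return (np.abs(u) <= c).astype(float)

def one_step(X, y, beta0, form="newton"):
    r = y - X @ beta0                       # residuals at the initial fit
    sigma = np.median(np.abs(r)) / 0.6745   # MAD scale estimate (assumed)
    u = r / sigma
    if form == "newton":
        # Newton-Raphson: "Hessian" uses the observed psi'(u_i)
        H = X.T @ (dpsi_huber(u)[:, None] * X)
        return beta0 + np.linalg.solve(H, sigma * (X.T @ psi_huber(u)))
    if form == "scoring":
        # Method of scoring: psi'(u_i) replaced by its average
        H = dpsi_huber(u).mean() * (X.T @ X)
        return beta0 + np.linalg.solve(H, sigma * (X.T @ psi_huber(u)))
    if form == "irls":
        # IRLS: one weighted least squares step with weights psi(u_i)/u_i
        w = np.divide(psi_huber(u), u, out=np.ones_like(u), where=(u != 0))
        return np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    raise ValueError(f"unknown form: {form}")

# Per the abstract, the "clean data" strategy amounts to an IRLS one-step
# started from the (non-robust) least squares fit:
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = X @ np.array([1.0, 2.0]) + rng.standard_t(df=2, size=100)  # heavy tails
beta_ls = np.linalg.lstsq(X, y, rcond=None)[0]
for form in ("newton", "scoring", "irls"):
    print(form, one_step(X, y, beta_ls, form=form))
```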
Original language | English |
---|---|
Pages (from-to) | 287-310 |
Number of pages | 24 |
Journal | Journal of Statistical Planning and Inference |
Volume | 103 |
Issue number | 1-2 |
DOIs | |
Publication status | Published - 15 Apr 2002 |