TY - JOUR
T1 - Nonparametric kernel regression subject to monotonicity constraints
AU - Hall, Peter
AU - Huang, Li Shan
PY - 2001/6
Y1 - 2001/6
N2 - We suggest a method for monotonizing general kernel-type estimators, for example local linear estimators and Nadaraya-Watson estimators. Attributes of our approach include the fact that it produces smooth estimates, indeed with the same smoothness as the unconstrained estimate. The method is applicable to a particularly wide range of estimator types; it can be trivially modified to render an estimator strictly monotone, and it can be employed after the smoothing step has been implemented. Therefore, an experimenter may use his or her favorite kernel estimator, and his or her favorite bandwidth selector, to construct the basic nonparametric smoother and then use our technique to render it monotone in a smooth way. Implementation involves only an off-the-shelf programming routine. The method is based on maximizing fidelity to the conventional empirical approach, subject to monotonicity. We adjust the unconstrained estimator by tilting the empirical distribution so as to make the least possible change, in the sense of a distance measure, subject to imposing the constraint of monotonicity.
AB - We suggest a method for monotonizing general kernel-type estimators, for example local linear estimators and Nadaraya-Watson estimators. Attributes of our approach include the fact that it produces smooth estimates, indeed with the same smoothness as the unconstrained estimate. The method is applicable to a particularly wide range of estimator types; it can be trivially modified to render an estimator strictly monotone, and it can be employed after the smoothing step has been implemented. Therefore, an experimenter may use his or her favorite kernel estimator, and his or her favorite bandwidth selector, to construct the basic nonparametric smoother and then use our technique to render it monotone in a smooth way. Implementation involves only an off-the-shelf programming routine. The method is based on maximizing fidelity to the conventional empirical approach, subject to monotonicity. We adjust the unconstrained estimator by tilting the empirical distribution so as to make the least possible change, in the sense of a distance measure, subject to imposing the constraint of monotonicity.
KW - Bandwidth
KW - Biased bootstrap
KW - Gasser-Müller estimator
KW - Isotonic regression
KW - Local linear estimator
KW - Nadaraya-Watson estimator
KW - Order restricted inference
KW - Power divergence
KW - Priestley-Chao estimator
KW - Weighted bootstrap
UR - http://www.scopus.com/inward/record.url?scp=0035539846&partnerID=8YFLogxK
U2 - 10.1214/aos/1009210683
DO - 10.1214/aos/1009210683
M3 - Article
SN - 0090-5364
VL - 29
SP - 624
EP - 647
JO - Annals of Statistics
JF - Annals of Statistics
IS - 3
ER -