TY - JOUR
T1 - Prediction in functional linear regression
AU - Cai, T. Tony
AU - Hall, Peter
PY - 2006/10
Y1 - 2006/10
N2 - There has been substantial recent work on methods for estimating the slope function in linear regression for functional data analysis. However, as in the case of more conventional finite-dimensional regression, much of the practical interest in the slope centers on its application for the purpose of prediction, rather than on its significance in its own right. We show that the problems of slope-function estimation, and of prediction from an estimator of the slope function, have very different characteristics. While the former is intrinsically nonparametric, the latter can be either nonparametric or semi-parametric. In particular, the optimal mean-square convergence rate of predictors is n^{-1}, where n denotes sample size, if the predictand is a sufficiently smooth function. In other cases, convergence occurs at a polynomial rate that is strictly slower than n^{-1}. At the boundary between these two regimes, the mean-square convergence rate is less than n^{-1} by only a logarithmic factor. More generally, the rate of convergence of the predicted value of the mean response in the regression model, given a particular value of the explanatory variable, is determined by a subtle interaction among the smoothness of the predictand, of the slope function in the model, and of the autocovariance function for the distribution of explanatory variables.
KW - Bootstrap
KW - Covariance
KW - Dimension reduction
KW - Eigenfunction
KW - Eigenvalue
KW - Eigenvector
KW - Functional data analysis
KW - Intercept
KW - Minimax
KW - Optimal convergence rate
KW - Principal components analysis
KW - Rate of convergence
KW - Slope
KW - Smoothing
KW - Spectral decomposition
UR - http://www.scopus.com/inward/record.url?scp=33847349358&partnerID=8YFLogxK
DO - 10.1214/009053606000000830
M3 - Article
SN - 0090-5364
VL - 34
SP - 2159
EP - 2179
JO - Annals of Statistics
JF - Annals of Statistics
IS - 5
ER -