Convergence rates for kernel regression in infinite-dimensional spaces

Chowdhury, Joydeep; Chaudhuri, Probal (2020). Convergence rates for kernel regression in infinite-dimensional spaces. Annals of the Institute of Statistical Mathematics, 72(2), pp. 471-509. ISSN 0020-3157.

Full text not available from this repository.

Official URL: http://doi.org/10.1007/s10463-018-0697-2

Abstract

We consider a nonparametric regression setup where the covariate is a random element in a complete separable metric space and the parameter of interest, associated with the conditional distribution of the response, lies in a separable Banach space. We derive the optimum convergence rate for the kernel estimate of the parameter in this setup. The small ball probability in the covariate space plays a critical role in determining the asymptotic variance of kernel estimates. Unlike the case of finite-dimensional covariates, we show that the asymptotic orders of the bias and the variance of an estimate achieving the optimum convergence rate may differ for infinite-dimensional covariates. Moreover, the bandwidth that balances the bias and the variance may lead to an estimate with a suboptimal mean square error for infinite-dimensional covariates. We describe a data-driven adaptive choice of the bandwidth and derive the asymptotic behavior of the adaptive estimate.
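
Since the full text is not deposited here, the following is a minimal sketch, for orientation only, of the kind of estimate the abstract describes: a Nadaraya-Watson-type kernel regression with a functional covariate, restricted to a scalar response (the simplest case of the Banach-space-valued parameter). The L2 metric on curves, the Gaussian kernel, the candidate bandwidth grid, and the leave-one-out cross-validation standing in for the paper's data-driven adaptive bandwidth are all illustrative assumptions, not choices taken from the paper.

```python
# Illustrative sketch only; not the estimator or bandwidth rule from the paper.
import numpy as np

def l2_metric(x, y):
    """L2-type distance between two curves sampled on a common grid."""
    return np.sqrt(np.mean((x - y) ** 2))

def kernel_regression(X_train, y_train, x_new, h, metric=l2_metric):
    """Nadaraya-Watson estimate of E[Y | X = x_new] with a Gaussian kernel."""
    d = np.array([metric(x, x_new) for x in X_train])
    w = np.exp(-0.5 * (d / h) ** 2)
    if w.sum() == 0.0:  # no covariate in the effective window; fall back
        return np.mean(y_train)
    return np.dot(w, y_train) / w.sum()

def loo_cv_bandwidth(X_train, y_train, candidates, metric=l2_metric):
    """Pick h minimizing leave-one-out squared prediction error."""
    n = len(y_train)
    best_h, best_err = None, np.inf
    for h in candidates:
        err = 0.0
        for i in range(n):
            mask = np.arange(n) != i  # hold out observation i
            pred = kernel_regression(X_train[mask], y_train[mask],
                                     X_train[i], h, metric)
            err += (y_train[i] - pred) ** 2
        if err < best_err:
            best_h, best_err = h, err
    return best_h

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    grid = np.linspace(0.0, 1.0, 50)
    # Covariates: random curves; the response depends on each curve's level.
    X = np.array([np.sin(2 * np.pi * grid * rng.uniform(0.5, 2.0))
                  + rng.uniform(-1.0, 1.0) for _ in range(100)])
    y = X.mean(axis=1) + 0.1 * rng.standard_normal(100)
    h = loo_cv_bandwidth(X, y, candidates=np.geomspace(0.05, 2.0, 10))
    print("selected bandwidth:", h)
    print("prediction at X[0]:", kernel_regression(X, y, X[0], h))
```

In this sketch the role of the small ball probability is visible in the kernel weights: for an infinite-dimensional covariate, few training curves fall within distance h of the query point, so the effective sample size of the weighted average shrinks, which is the variance effect the abstract attributes to the small ball probability in the covariate space.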

Item Type: Article
Keywords: Adaptive estimate; Bias-variance decomposition; Gaussian process; Maximum likelihood regression; Mean square error; Optimal bandwidth; Small ball probability; t process
ID Code: 130688
Deposited On: 29 Nov 2022 06:35
Last Modified: 29 Nov 2022 06:35
