arXiv Analytics

arXiv:math/0511111 [math.ST]

Nonparametric Estimation of the Regression Function in an Errors-in-Variables Model

Fabienne Comte, Marie-Luce Taupin

Published 2005-11-04 (Version 1)

We consider the regression model with errors-in-variables where we observe $n$ i.i.d. copies of $(Y,Z)$ satisfying $Y=f(X)+\xi, Z=X+\sigma\epsilon$, involving independent and unobserved random variables $X,\xi,\epsilon$. The density $g$ of $X$ is unknown, whereas the density of $\sigma\epsilon$ is completely known. Using the observations $(Y_i, Z_i)$, $i=1,\ldots,n$, we propose an estimator of the regression function $f$, built as the ratio of two penalized minimum contrast estimators of $\ell=fg$ and $g$, without any prior knowledge on their smoothness. We prove that its $\mathbb{L}_2$-risk on a compact set is bounded by the sum of the two $\mathbb{L}_2(\mathbb{R})$-risks of the estimators of $\ell$ and $g$, and give the rate of convergence of such estimators for various smoothness classes for $\ell$ and $g$, when the errors $\epsilon$ are either ordinary smooth or super smooth. The resulting rate is optimal in a minimax sense in all cases where lower bounds are available.
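The ratio structure $\hat f = \hat\ell / \hat g$ can be illustrated with a simpler, classical stand-in for the paper's penalized minimum contrast estimators: deconvolution kernel estimators of $\ell = fg$ and $g$ built from the known Gaussian error density. Everything below (the choice $f(x)=\sin x$, standard normal $g$, $\sigma$, the bandwidth $h$, and the sinc kernel) is an illustrative assumption, not the authors' construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated errors-in-variables data (hypothetical choices of f, g, sigma).
n = 2000
sigma = 0.3
X = rng.normal(0.0, 1.0, n)        # unobserved design, density g (assumed N(0,1))
xi = rng.normal(0.0, 0.2, n)       # regression noise
eps = rng.normal(0.0, 1.0, n)      # measurement error; its density is known
f_true = lambda x: np.sin(x)       # true regression function (assumed)
Y = f_true(X) + xi                 # observed response
Z = X + sigma * eps                # observed noisy covariate

def deconv_kernel(u, h, sigma, m=201):
    """Deconvolution kernel for a sinc kernel K (Fourier transform 1 on [-1,1])
    and Gaussian errors:  K_n(u) = (1/2pi) * int_{-1}^{1} cos(t u)
    * exp(sigma^2 t^2 / (2 h^2)) dt, computed by the trapezoid rule."""
    t = np.linspace(-1.0, 1.0, m)
    w = np.exp(sigma**2 * t**2 / (2.0 * h**2))   # 1 / Fourier transform of error
    integrand = np.cos(np.outer(u, t)) * w
    dt = t[1] - t[0]
    val = integrand.sum(axis=-1) - 0.5 * (integrand[:, 0] + integrand[:, -1])
    return val * dt / (2.0 * np.pi)

def estimate_f(x_grid, Y, Z, sigma, h=0.4):
    """f_hat = l_hat / g_hat, both built from the same deconvolution kernel."""
    U = (x_grid[:, None] - Z[None, :]) / h
    K = deconv_kernel(U.ravel(), h, sigma).reshape(U.shape)
    g_hat = K.mean(axis=1) / h                   # estimator of the density g
    l_hat = (Y[None, :] * K).mean(axis=1) / h    # estimator of l = f * g
    return l_hat / np.maximum(g_hat, 1e-3)       # guard against small densities

x_grid = np.linspace(-1.5, 1.5, 25)
f_hat = estimate_f(x_grid, Y, Z, sigma)
```

The factor $\exp(\sigma^2 t^2/(2h^2))$ is where the super-smooth (Gaussian) error structure bites: it inflates high frequencies, which is why the bandwidth $h$ cannot be taken too small and why the attainable rates depend on the smoothness of the error density.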

Related articles:
arXiv:0708.0506 [math.ST] (Published 2007-08-03)
Nonparametric estimation when data on derivatives are available
arXiv:0908.3108 [math.ST] (Published 2009-08-21)
Nonparametric estimation by convex programming
arXiv:2407.14993 [math.ST] (Published 2024-07-20)
Lower Bounds for Nonparametric Estimation of Ordinary Differential Equations