Upper Bounds for Errors of Estimators in a Problem of Nonparametric Regression: The Adaptive Case and the Case of Unknown Measure $\rho_X$
Yu. V. Malykhin Steklov Mathematical Institute, Russian Academy of Sciences
Abstract:
We construct estimators of regression functions and prove theorems on their errors in two different settings. In the first, we consider so-called adaptive estimators, whose error is close to optimal for an entire family of classes of possible regression functions; the estimators are adaptive in that they are constructed without any information about which class is chosen. In the second, the class of possible regression functions is fixed, but the marginal measure is unknown, and the estimator is constructed without any information about this measure; its error turns out to be close to the minimal possible (worst-case) error.
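The second setting can be illustrated with a minimal sketch of least-squares estimation over a candidate class, where the sample is drawn from a marginal measure $\rho_X$ that the estimator never sees. This is not the paper's construction; the candidate class, the choice of $\rho_X$, and all function names below are hypothetical.

```python
import numpy as np

def least_squares_estimator(candidates, x, y):
    """Return the candidate minimizing the empirical squared error
    (1/n) * sum_i (f(x_i) - y_i)^2 over the given finite class."""
    risks = [np.mean((f(x) - y) ** 2) for f in candidates]
    return candidates[int(np.argmin(risks))]

rng = np.random.default_rng(0)

# Sample (X, Y): the marginal measure rho_X is unknown to the estimator;
# here, for illustration only, it is taken to be uniform on [0, 1].
n = 500
x = rng.uniform(0.0, 1.0, n)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=n)  # regression + noise

# A small hypothetical class of candidate regression functions.
candidates = [
    lambda t: np.sin(2 * np.pi * t),
    lambda t: np.cos(2 * np.pi * t),
    lambda t: t,
]

best = least_squares_estimator(candidates, x, y)
```

The estimator uses only the sample $(x_i, y_i)$, never $\rho_X$ itself; in the adaptive setting one would additionally select among nested classes, e.g. by penalizing larger classes.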
Keywords:
nonparametric regression, regression function, adaptive estimator, marginal measure, Bernstein's inequality, combinatorial dimension, least-squares method.
Received: 20.11.2008 Revised: 28.02.2009
Citation:
Yu. V. Malykhin, “Upper Bounds for Errors of Estimators in a Problem of Nonparametric Regression: The Adaptive Case and the Case of Unknown Measure $\rho_X$”, Mat. Zametki, 86:5 (2009), 725–732; Math. Notes, 86:5 (2009), 682–689
Linking options:
https://www.mathnet.ru/eng/mzm6571
https://doi.org/10.4213/mzm6571
https://www.mathnet.ru/eng/mzm/v86/i5/p725