Please use this identifier to cite or link to this item: http://ktisis.cut.ac.cy/handle/10488/6671
Title: Rates of convergence of estimates, Kolmogorov's entropy and the dimensionality reduction principle in regression
Authors: Nicoleris, Theodoros 
Yatracos, Yannis G. 
Keywords: Convergence
Estimation
Issue Date: 1997
Publisher: Project Euclid
Source: Annals of Statistics, 1997, Volume 25, Issue 6, Pages 2493-2511
Abstract: L1-optimal minimum distance estimators are provided for a projection pursuit regression type function with smooth functional components that are either additive or multiplicative, with or without interactions. The rates of convergence of the estimate to the true parameter depend on Kolmogorov's entropy of the assumed model and confirm Stone's heuristic dimensionality reduction principle. Rates of convergence are also obtained for the error in estimating the derivatives of a regression type function.
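For orientation, a minimal illustration of Stone's dimensionality reduction principle referenced in the abstract, stated in its standard form and assuming each component function is p-smooth (these are the classical rates, not results quoted from this paper):

\[
  \text{unrestricted } d\text{-dimensional regression:}\quad n^{-p/(2p+d)},
  \qquad
  \text{additive model } f(x)=\textstyle\sum_{j=1}^{d} f_j(x_j):\quad n^{-p/(2p+1)}.
\]

That is, imposing additive (or multiplicative) structure lets an estimator attain the one-dimensional rate, which is the heuristic that the paper's entropy-based rates confirm.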
URI: http://ktisis.cut.ac.cy/handle/10488/6671
ISSN: 0090-5364
DOI: 10.1214/aos/1030741082
Appears in Collections: Άρθρα/Articles

Files in This Item:
File: euclid.aos.1030741082.pdf (158.25 kB, Adobe PDF)
