Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/1142
Title: Rates of convergence of estimates, Kolmogorov's entropy and the dimensionality reduction principle in regression
Authors: Nicoleris, Theodoros
Yatracos, Yannis G.
Other contributor: Γιατράκος, Γιάννης (Yatracos, Yannis)
Major Field of Science: Natural Sciences
Field Category: Mathematics
Keywords: Convergence;Estimation
Issue Date: Dec-1997
Source: Annals of Statistics, 1997, vol. 25, no. 6, pp. 2493-2511
Volume: 25
Issue: 6
Start page: 2493
End page: 2511
Journal: Annals of Statistics 
Abstract: L1-optimal minimum distance estimators are provided for a projection pursuit regression type function with smooth functional components that are either additive or multiplicative, with or without interactions. The obtained rates of convergence of the estimate to the true parameter depend on Kolmogorov's entropy of the assumed model and confirm Stone's heuristic dimensionality reduction principle. Rates of convergence are also obtained for the error in estimating the derivatives of a regression type function.
URI: https://hdl.handle.net/20.500.14279/1142
ISSN: 0090-5364
DOI: 10.1214/aos/1030741082
Rights: © Institute of Mathematical Statistics
Type: Article
Affiliation: Université de Montréal
Appears in Collections: Articles

Files in This Item:
File: euclid.aos.1030741082.pdf (158.25 kB, Adobe PDF)
Items in KTISIS are protected by copyright, with all rights reserved, unless otherwise indicated.