Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/18537
Title: Plug-in $L_2$-upper error bounds in deconvolution, for a mixing density estimate in $\mathbb{R}^d$ and for its derivatives, via the $L_1$-error for the mixture
Authors: Yatracos, Yannis G. 
Major Field of Science: Natural Sciences
Field Category: Mathematics
Keywords: Deconvolution;Minimum distance estimation;Plug-in upper error/risk bounds;Totally positive kernels;Vapnik–Chervonenkis classes
Issue Date: 30-Jul-2019
Source: Statistics, 2019, vol. 53, no. 6, pp. 1251-1268
Volume: 53
Issue: 6
Start page: 1251
End page: 1268
Journal: Statistics 
Abstract: In deconvolution in $\mathbb{R}^d$, $d \ge 1$, with mixing density $p\,(\in P)$ and kernel $h$, the mixture density $f_p\,(\in F_P)$ is estimated with the MDE $f_{\hat{p}_n}$, having upper $L_1$-error rate $a_n$, in probability or in risk; $\hat{p}_n \in P$. In one application, $P$ consists of $L_1$-separable densities in $\mathbb{R}$ with differences changing sign at most $J$ times, and $h(x-y)$ is Totally Positive. When $h$ is known and $p$ is $\tilde{q}$-smooth, vanishing outside a compact set in $\mathbb{R}^d$, plug-in upper bounds are provided for the $L_2$-error rate of $\hat{p}_n$ and of its $[s]$-th mixed partial derivative $\hat{p}_n^{(s)}$, via $\|f_{\hat{p}_n} - f_p\|_1$, with rates $(\log a_n^{-1})^{-N_1}$ and $a_n^{N_2}$, respectively, for $h$ super-smooth and smooth; $\tilde{q} \in \mathbb{R}^+$, $[s] \le \tilde{q}$, $d \ge 1$, $N_1 > 0$, $N_2 > 0$. For $a_n \sim (\log n)^{\zeta} \cdot n^{-\delta}$, the former rate is optimal for any $\delta > 0$, and the latter misses the optimal rate by the factor $(\log n)^{\xi}$ when $\delta = 0.5$; $\zeta > 0$, $\xi > 0$. $N_1$ and $N_2$ appear in optimal rates and in lower error and risk bounds in the deconvolution literature.
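A minimal sketch of the plug-in bound structure described in the abstract, assuming unspecified constants $C_1, C_2 > 0$ (the exact form of the bounds is given in the paper):
\[
\|\hat{p}_n - p\|_2 \;\le\; C_1 \Big(\log \frac{1}{\|f_{\hat{p}_n} - f_p\|_1}\Big)^{-N_1} \quad (h \text{ super-smooth}),
\qquad
\|\hat{p}_n - p\|_2 \;\le\; C_2\, \|f_{\hat{p}_n} - f_p\|_1^{\,N_2} \quad (h \text{ smooth}),
\]
so substituting the MDE's $L_1$-rate $a_n$ yields the stated $L_2$-rates $(\log a_n^{-1})^{-N_1}$ and $a_n^{N_2}$ for $\hat{p}_n$ and, analogously, for $\hat{p}_n^{(s)}$. For example, when $a_n \sim (\log n)^{\zeta} n^{-\delta}$, one has $\log a_n^{-1} = \delta \log n - \zeta \log\log n \sim \delta \log n$, so the super-smooth rate is of order $(\log n)^{-N_1}$, consistent with the optimality statement in the abstract.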
URI: https://hdl.handle.net/20.500.14279/18537
ISSN: 1029-4910
DOI: 10.1080/02331888.2019.1632313
Rights: © Taylor & Francis
License: Attribution-NonCommercial-NoDerivs 3.0 United States
Type: Article
Affiliation: Tsinghua University; Cyprus University of Technology
Publication Type: Peer Reviewed
Appears in Collections: Άρθρα/Articles
