Please use this identifier to cite or link to this item: http://ktisis.cut.ac.cy/handle/10488/6673
Title: Linear least squares regression: a different view
Authors: Yatracos, Yannis G. 
Keywords: Regression analysis; Parameter estimation
Issue Date: 1996
Publisher: Elsevier
Source: Statistics and Probability Letters, 1996, Volume 29, Issue 2, Pages 143-148
Abstract: The main result of this paper fills an existing gap between the theory of least squares regression and the solution of linear systems of equations. A linear least squares regression problem with p parameters over n cases is converted, via non-orthogonal transformations, into a k-parameter regression problem through the origin on n - p + k cases, together with p - k equations in diagonal form in p - k unknowns, 0 < k < p. As a consequence of this result: (i) tests and confidence intervals are easily obtained for any subset of the parameters of the model; (ii) the regression problem can be converted into p univariate regression problems through the origin, each based on only n - p + 1 cases; (iii) one can speak of the influence of the observations on any subset of the least squares estimates; (iv) the PC user can solve regression problems of higher dimension than those previously handled.
URI: http://ktisis.cut.ac.cy/handle/10488/6673
ISSN: 0167-7152
DOI: 10.1016/0167-7152(95)00167-0
Rights: © 1996 Elsevier B.V. All rights reserved.
Appears in Collections: Άρθρα/Articles
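A rough illustration of consequence (ii) from the abstract: a single coefficient of a multi-parameter least squares fit can be recovered from a one-parameter regression through the origin. The minimal Python sketch below uses the standard Frisch-Waugh-Lovell partialling-out device (an orthogonal projection applied to all n cases), not the paper's non-orthogonal transformation to n - p + 1 cases; the simulated data and variable names are hypothetical.

import numpy as np

# Hypothetical sketch: recover the last coefficient of a p-parameter least
# squares fit from a univariate regression through the origin, via ordinary
# partialling-out (Frisch-Waugh-Lovell). This is NOT the paper's
# non-orthogonal reduction to n - p + 1 cases, only the general idea.
rng = np.random.default_rng(0)
n, p = 50, 4
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + rng.normal(size=n)

# Full fit: all p coefficients at once.
beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)

# Project y and the last column of X onto the orthogonal complement of the
# first p - 1 columns, then fit a one-parameter regression through the origin.
X1, x2 = X[:, :-1], X[:, -1]
M1 = np.eye(n) - X1 @ np.linalg.pinv(X1)          # residual-maker for X1
y_res, x2_res = M1 @ y, M1 @ x2
beta_last = (x2_res @ y_res) / (x2_res @ x2_res)  # slope through the origin

assert np.isclose(beta_last, beta_full[-1])       # matches the full-fit estimate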
