Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/2093
DC Field | Value | Language
dc.contributor.author | Gatu, Cristian | -
dc.contributor.author | Kontoghiorghes, Erricos John | -
dc.contributor.other | Κοντογιώργης, Έρρικος Γιάννης | -
dc.date.accessioned | 2013-01-30T12:39:53Z | en
dc.date.accessioned | 2013-05-16T08:21:56Z | -
dc.date.accessioned | 2015-12-02T09:28:37Z | -
dc.date.available | 2013-01-30T12:39:53Z | en
dc.date.available | 2013-05-16T08:21:56Z | -
dc.date.available | 2015-12-02T09:28:37Z | -
dc.date.issued | 2006-03 | -
dc.identifier.citation | Journal of Computational and Graphical Statistics, 2006, vol. 15, no. 1, pp. 139-156 | en_US
dc.identifier.issn | 15372715 | -
dc.identifier.uri | https://hdl.handle.net/20.500.14279/2093 | -
dc.description.abstract | An efficient branch-and-bound algorithm for computing the best-subset regression models is proposed. The algorithm avoids the computation of the whole regression tree that generates all possible subset models. It is formally shown that if the branch-and-bound test holds, then the current subtree together with its right-hand-side subtrees are cut. This significantly reduces the computational burden of the proposed algorithm when compared to an existing leaps-and-bounds method which generates two trees. Specifically, the proposed algorithm, which is based on orthogonal transformations, outperforms the leaps-and-bounds strategy by O(n³). The criteria used in identifying the best subsets are based on monotone functions of the residual sum of squares (RSS), such as R², adjusted R², the mean square error of prediction, and Cp. Strategies and heuristics that improve the computational performance of the proposed algorithm are investigated. A computationally efficient heuristic version of the branch-and-bound strategy, which decides to cut subtrees using a tolerance parameter, is proposed. The heuristic algorithm derives models close to the best ones. However, it is shown analytically that the relative error of the RSS, and consequently of the corresponding statistic, of the computed subsets is smaller than the value of the tolerance parameter, which lies between zero and one. Computational results and experiments on random and real data are presented and analyzed. | en_US
dc.format | pdf | en_US
dc.language.iso | en | en_US
dc.relation.ispartof | Journal of Computational and Graphical Statistics | en_US
dc.rights | © American Statistical Association. | en_US
dc.subject | Least squares | en_US
dc.subject | Algorithms | en_US
dc.title | Branch-and-bound algorithms for computing the best-subset regression models | en_US
dc.type | Article | en_US
dc.collaboration | University of Cyprus | en_US
dc.collaboration | University of London | en_US
dc.subject.category | Computer and Information Sciences | en_US
dc.journals | Subscription | en_US
dc.country | Cyprus | en_US
dc.subject.field | Natural Sciences | en_US
dc.publication | Peer Reviewed | en_US
dc.identifier.doi | 10.1198/106186006X100290 | en_US
dc.dept.handle | 123456789/54 | en
dc.relation.issue | 1 | en_US
dc.relation.volume | 15 | en_US
cut.common.academicyear | 2006-2007 | en_US
dc.identifier.spage | 139 | en_US
dc.identifier.epage | 156 | en_US
item.grantfulltext | none | -
item.openairecristype | http://purl.org/coar/resource_type/c_6501 | -
item.fulltext | No Fulltext | -
item.languageiso639-1 | en | -
item.cerifentitytype | Publications | -
item.openairetype | article | -
crisitem.journal.journalissn | 1537-2715 | -
crisitem.journal.publisher | Taylor & Francis | -
crisitem.author.dept | Department of Finance, Accounting and Management Science | -
crisitem.author.faculty | Faculty of Tourism Management, Hospitality and Entrepreneurship | -
crisitem.author.orcid | 0000-0001-9704-9510 | -
crisitem.author.parentorg | Faculty of Management and Economics | -
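
Note on the method: the abstract above describes a branch-and-bound search over the regression tree of all subset models, using the fact that the RSS can never increase when variables are added; the RSS of a node's full candidate set therefore bounds from below the RSS of every model in that node's subtree, and the subtree can be cut when this bound cannot improve the best models found so far. The Python sketch below illustrates that idea only under stated assumptions; it is not the authors' implementation. The paper's algorithm works with orthogonal (QR-type) transformations and also cuts right-hand-side subtrees, whereas this sketch simply refits each node with a generic least-squares solver. The names best_subsets and rss and the tolerance test (1 + tol) * bound are illustrative assumptions, not the paper's exact formulation.

# Minimal branch-and-bound sketch for best-subset regression (illustrative only).
import numpy as np

def rss(X, y, cols):
    """RSS of the least-squares fit of y on the columns `cols` of X."""
    if not cols:
        return float(y @ y)
    beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
    resid = y - X[:, cols] @ beta
    return float(resid @ resid)

def best_subsets(X, y, tol=0.0):
    """Return {size: (best RSS found, column indices)} for every subset size."""
    p = X.shape[1]
    best = {k: (np.inf, None) for k in range(1, p + 1)}

    def visit(selected, candidates):
        # Score the model defined by the currently selected columns.
        if selected:
            r = rss(X, y, selected)
            k = len(selected)
            if r < best[k][0]:
                best[k] = (r, tuple(selected))
        if not candidates:
            return
        # Lower bound for every model in this subtree: the RSS of the model
        # using all selected plus all remaining candidate columns.
        bound = rss(X, y, selected + candidates)
        reachable = range(len(selected) + 1, len(selected) + len(candidates) + 1)
        # Branch-and-bound test: cut the subtree if the bound shows it cannot
        # improve the best RSS of any reachable subset size.  The factor
        # (1 + tol) mimics the tolerance-based heuristic described in the
        # abstract (tol = 0 gives the exact search); the exact cutting rule
        # used in the paper may differ.
        if all((1.0 + tol) * bound >= best[k][0] for k in reachable):
            return
        for i in range(len(candidates)):
            visit(selected + [candidates[i]], candidates[i + 1:])

    visit([], list(range(p)))
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((60, 8))
    y = X[:, [1, 4]] @ np.array([2.0, -1.5]) + 0.1 * rng.standard_normal(60)
    for k, (r, cols) in best_subsets(X, y).items():
        print(f"size {k}: RSS = {r:.4f}, columns = {cols}")

Because every criterion mentioned in the abstract (R², adjusted R², mean square error of prediction, Cp) is a monotone function of the RSS for a fixed subset size, keeping the best RSS per size is enough: the preferred model under any of these criteria can be chosen afterwards from the per-size results.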
Appears in Collections: Άρθρα/Articles
SCOPUS™ Citations: 55 (checked on Nov 9, 2023)
Web of Science™ Citations: 52 (Last Week: 0, Last month: 0; checked on Oct 29, 2023)
Page view(s): 602 (Last Week: 0, Last month: 6; checked on Dec 22, 2024)

Items in KTISIS are protected by copyright, with all rights reserved, unless otherwise indicated.