Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/1857
DC Field: Value
dc.contributor.author: Georgiopoulos, Michael N.
dc.contributor.author: Bebis, George N.
dc.contributor.author: Kasparis, Takis
dc.contributor.other: Κασπαρής, Τάκης
dc.date.accessioned: 2013-02-18T13:15:04Z
dc.date.accessioned: 2013-05-17T05:21:59Z
dc.date.accessioned: 2015-12-02T09:50:25Z
dc.date.available: 2013-02-18T13:15:04Z
dc.date.available: 2013-05-17T05:21:59Z
dc.date.available: 2015-12-02T09:50:25Z
dc.date.issued: 1997-11
dc.identifier.citation: Neurocomputing, 1997, vol. 17, no. 3-4, pp. 167-194
dc.identifier.issn: 0925-2312
dc.identifier.uri: https://hdl.handle.net/20.500.14279/1857
dc.description.abstract: Recent theoretical results suggest that decreasing the number of free parameters in a neural network (i.e., weights) can improve generalization. These results have triggered the development of many approaches that try to determine an 'appropriate' network size for a given problem. The main goal has been to find a network size just large enough to capture the general class properties of the data. In some cases, however, network size is not reduced significantly, or the reduction is satisfactory but generalization suffers. In this paper, we propose coupling genetic algorithms with weight elimination. Our objective is not only to reduce network size significantly, by pruning larger networks, but also to preserve generalization, that is, to arrive at pruned networks that generalize as well as or even better than their unpruned counterparts. The innovation of our work lies in a fitness function that uses an adaptive parameter to encourage the reproduction of networks having small size and good generalization. The proposed approach has been tested on both artificial and real databases, demonstrating good performance.
dc.format: pdf
dc.language.iso: en
dc.relation.ispartof: Neurocomputing
dc.rights: © Elsevier
dc.subject: Neural networks
dc.subject: Genetic algorithms
dc.subject: Pruning
dc.subject: Databases
dc.title: Coupling weight elimination with genetic algorithms to reduce network size and preserve generalization
dc.type: Article
dc.collaboration: University of Central Florida
dc.subject.category: Electrical Engineering - Electronic Engineering - Information Engineering
dc.journals: Hybrid Open Access
dc.country: United States
dc.subject.field: Engineering and Technology
dc.publication: Peer Reviewed
dc.identifier.doi: 10.1016/S0925-2312(97)00050-7
dc.dept.handle: 123456789/54
dc.relation.issue: 3-4
dc.relation.volume: 17
cut.common.academicyear: 1997-1998
dc.identifier.spage: 167
dc.identifier.epage: 194
item.grantfulltext: none
item.languageiso639-1: en
item.cerifentitytype: Publications
item.openairecristype: http://purl.org/coar/resource_type/c_6501
item.openairetype: article
item.fulltext: No Fulltext
crisitem.journal.journalissn: 0925-2312
crisitem.journal.publisher: Elsevier
crisitem.author.dept: Department of Electrical Engineering, Computer Engineering and Informatics
crisitem.author.faculty: Faculty of Engineering and Technology
crisitem.author.orcid: 0000-0003-3486-538X
crisitem.author.parentorg: Faculty of Engineering and Technology
Appears in Collections: Άρθρα/Articles
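The abstract describes a genetic algorithm whose fitness function balances generalization against network size through an adaptive parameter. The sketch below is a minimal, illustrative reconstruction of that idea, not the authors' published method: the binary-mask encoding, the evaluate() stand-in, the exact fitness form, and the adaptation rule for lam are all assumptions introduced here for illustration.

```python
# Hedged sketch: GA-based pruning with a size/generalization trade-off.
# All names, constants, and the adaptation rule are illustrative assumptions.
import random

random.seed(0)

N_WEIGHTS = 32     # weights in a hypothetical trained network
POP_SIZE = 20      # candidate pruning masks per generation
GENERATIONS = 40

def evaluate(mask):
    """Stand-in for retraining and validating the pruned network.
    Returns (generalization proxy, pruned network size). Here we fake a
    score that rewards keeping a small 'useful' subset of the weights."""
    useful = set(range(10))                      # pretend 10 weights matter
    kept = {i for i, bit in enumerate(mask) if bit}
    score = len(useful & kept) / len(useful) - 0.01 * len(kept - useful)
    return score, sum(mask)

def fitness(mask, lam):
    """Adaptive parameter lam trades generalization against size,
    loosely in the spirit of the paper's fitness function."""
    score, size = evaluate(mask)
    return score - lam * size / N_WEIGHTS

# Each chromosome is a binary mask: 1 keeps a weight, 0 prunes it.
pop = [[random.randint(0, 1) for _ in range(N_WEIGHTS)] for _ in range(POP_SIZE)]
lam = 0.1
for gen in range(GENERATIONS):
    pop.sort(key=lambda m: fitness(m, lam), reverse=True)
    best_score, best_size = evaluate(pop[0])
    # Assumed adaptation rule: press harder on size while generalization holds.
    lam = min(1.0, lam * 1.05) if best_score >= 0.95 else max(0.01, lam * 0.9)
    parents = pop[: POP_SIZE // 2]               # truncation selection
    children = []
    while len(parents) + len(children) < POP_SIZE:
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, N_WEIGHTS)     # one-point crossover
        child = a[:cut] + b[cut:]
        i = random.randrange(N_WEIGHTS)          # single-bit mutation
        child[i] ^= 1
        children.append(child)
    pop = parents + children

print("pruned size:", best_size, "generalization proxy:", round(best_score, 3))
```

A binary mask per weight is the simplest chromosome for pruning; making lam adaptive, rather than a fixed penalty, reflects the abstract's point that the pressure toward small networks should only be applied while generalization is preserved.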

SCOPUS™ Citations: 55 (checked on Nov 9, 2023)
Web of Science™ Citations: 46 (last week: 0; last month: 0; checked on Oct 27, 2023)
Page views: 422 (last week: 5; last month: 6; checked on Nov 7, 2024)
Items in KTISIS are protected by copyright, with all rights reserved, unless otherwise indicated.