Please use this identifier to cite or link to this item:
https://hdl.handle.net/20.500.14279/1857
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Georgiopoulos, Michael N. | - |
dc.contributor.author | Bebis, George N. | - |
dc.contributor.author | Kasparis, Takis | - |
dc.contributor.other | Κασπαρής, Τάκης | - |
dc.date.accessioned | 2013-02-18T13:15:04Z | en |
dc.date.accessioned | 2013-05-17T05:21:59Z | - |
dc.date.accessioned | 2015-12-02T09:50:25Z | - |
dc.date.available | 2013-02-18T13:15:04Z | en |
dc.date.available | 2013-05-17T05:21:59Z | - |
dc.date.available | 2015-12-02T09:50:25Z | - |
dc.date.issued | 1997-11 | - |
dc.identifier.citation | Neurocomputing, 1997, vol. 17, no. 3-4, pp. 167-194 | en_US |
dc.identifier.issn | 0925-2312 | - |
dc.identifier.uri | https://hdl.handle.net/20.500.14279/1857 | - |
dc.description.abstract | Recent theoretical results support that decreasing the number of free parameters in a neural network (i.e., weights) can improve generalization. These results have triggered the development of many approaches which try to determine an 'appropriate' network size for a given problem. The main goal has been to find a network size just large enough to capture the general class properties of the data. In some cases, however, network size is not reduced significantly, or the reduction is satisfactory but generalization suffers. In this paper, we propose coupling genetic algorithms with weight elimination. Our objective is not only to reduce network size significantly, by pruning larger networks, but also to preserve generalization, that is, to arrive at pruned networks which generalize as well as or even better than their unpruned counterparts. The innovation of our work lies in a fitness function which uses an adaptive parameter to encourage the reproduction of networks having small size and good generalization. The proposed approach has been tested on both artificial and real databases, demonstrating good performance. | en_US |
dc.format | en_US | - |
dc.language.iso | en | en_US |
dc.relation.ispartof | Neurocomputing | en_US |
dc.rights | © Elsevier | en_US |
dc.subject | Neural networks | en_US |
dc.subject | Genetic algorithms | en_US |
dc.subject | Pruning | en_US |
dc.subject | Databases | en_US |
dc.title | Coupling weight elimination with genetic algorithms to reduce network size and preserve generalization | en_US |
dc.type | Article | en_US |
dc.collaboration | University of Central Florida | en_US |
dc.subject.category | Electrical Engineering - Electronic Engineering - Information Engineering | en_US |
dc.journals | Hybrid Open Access | en_US |
dc.country | United States | en_US |
dc.subject.field | Engineering and Technology | en_US |
dc.publication | Peer Reviewed | en_US |
dc.identifier.doi | 10.1016/S0925-2312(97)00050-7 | en_US |
dc.dept.handle | 123456789/54 | en |
dc.relation.issue | 3-4 | en_US |
dc.relation.volume | 17 | en_US |
cut.common.academicyear | 1997-1998 | en_US |
dc.identifier.spage | 167 | en_US |
dc.identifier.epage | 194 | en_US |
item.grantfulltext | none | - |
item.languageiso639-1 | en | - |
item.cerifentitytype | Publications | - |
item.openairecristype | http://purl.org/coar/resource_type/c_6501 | - |
item.openairetype | article | - |
item.fulltext | No Fulltext | - |
crisitem.journal.journalissn | 0925-2312 | - |
crisitem.journal.publisher | Elsevier | - |
crisitem.author.dept | Department of Electrical Engineering, Computer Engineering and Informatics | - |
crisitem.author.faculty | Faculty of Engineering and Technology | - |
crisitem.author.orcid | 0000-0003-3486-538X | - |
crisitem.author.parentorg | Faculty of Engineering and Technology | - |
Appears in Collections: | Άρθρα/Articles |
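The abstract above describes coupling a genetic algorithm with weight elimination: candidate pruned networks compete under a fitness function whose adaptive parameter trades network size against generalization. The following is a minimal Python sketch of that idea, not the authors' implementation; the fitness form, the adaptation rule for the penalty weight `lam`, and the stand-in `generalization` oracle are all illustrative assumptions.

```python
# Minimal sketch (not the paper's code) of a GA over pruning masks.
# Assumed fitness: f(mask) = generalization(mask) - lam * (kept weights / total),
# with lam adapted across generations to favour small, well-generalizing nets.
import random

N_WEIGHTS = 64        # size of the unpruned weight vector (assumed)
POP_SIZE = 30
GENERATIONS = 40
MUTATION_RATE = 0.02

def generalization(mask):
    """Stand-in for 'evaluate the pruned network on a validation set'.
    Here: a synthetic score that rewards keeping the first 16 'useful' weights."""
    useful = sum(mask[:16]) / 16
    noise = sum(mask[16:]) / (N_WEIGHTS - 16)
    return 0.7 * useful + 0.3 * (1 - noise)

def fitness(mask, lam):
    # Generalization reward minus an adaptive penalty on network size.
    return generalization(mask) - lam * sum(mask) / N_WEIGHTS

def crossover(a, b):
    cut = random.randrange(1, N_WEIGHTS)   # single-point crossover
    return a[:cut] + b[cut:]

def mutate(mask):
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in mask]

population = [[random.randint(0, 1) for _ in range(N_WEIGHTS)]
              for _ in range(POP_SIZE)]
lam = 0.1  # initial size-penalty weight (an assumption)

for gen in range(GENERATIONS):
    scored = sorted(population, key=lambda m: fitness(m, lam), reverse=True)
    best = scored[0]
    # One plausible adaptation rule: press harder on size while the best
    # network generalizes well, back off once pruning starts to hurt it.
    lam = min(0.5, lam * 1.05) if generalization(best) > 0.9 else max(0.01, lam * 0.95)
    elite = scored[:POP_SIZE // 2]
    population = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                          for _ in range(POP_SIZE - len(elite))]

print(f"kept {sum(best)}/{N_WEIGHTS} weights, "
      f"generalization {generalization(best):.3f}, lambda {lam:.3f}")
```

In a real setting, `generalization` would retrain or fine-tune the masked network and report validation accuracy, which is what makes the adaptive penalty useful: it only tightens the size pressure while generalization can absorb it.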
Scopus™ Citations: 55 (checked on Nov 9, 2023)
Web of Science™ Citations: 46 (checked on Oct 27, 2023)
Page view(s): 422 (last week: 5, last month: 6; checked on Nov 7, 2024)
Items in KTISIS are protected by copyright, with all rights reserved, unless otherwise indicated.