Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14279/1857
Title: Coupling weight elimination with genetic algorithms to reduce network size and preserve generalization
Authors: Georgiopoulos, Michael N. 
Bebis, George N. 
Kasparis, Takis 
Other contributor: Κασπαρής, Τάκης
Major Field of Science: Engineering and Technology
Field Category: Electrical Engineering - Electronic Engineering - Information Engineering
Keywords: Neural networks;Genetic algorithms;Pruning;Databases
Issue Date: Nov-1997
Source: Neurocomputing, 1997, vol. 17, no. 3-4, pp. 167-194
Volume: 17
Issue: 3-4
Start page: 167
End page: 194
Journal: Neurocomputing 
Abstract: Recent theoretical results suggest that decreasing the number of free parameters in a neural network (i.e., weights) can improve generalization. These results have triggered the development of many approaches which try to determine an 'appropriate' network size for a given problem. The main goal has been to find a network size just large enough to capture the general class properties of the data. In some cases, however, network size is not reduced significantly, or the reduction is satisfactory but generalization suffers. In this paper, we propose coupling genetic algorithms with weight elimination. Our objective is not only to significantly reduce network size, by pruning larger networks, but also to preserve generalization, that is, to produce pruned networks which generalize as well as or even better than their unpruned counterparts. The innovation of our work lies in a fitness function which uses an adaptive parameter to encourage reproduction of networks having small size and good generalization. The proposed approach has been tested on both artificial and real databases, demonstrating good performance.
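The abstract only outlines the method at a high level. The following minimal sketch, in plain NumPy, illustrates the general idea of evolving binary pruning masks for a trained network with a fitness function that couples validation performance with a size penalty whose strength adapts so that generalization is not sacrificed. All names (`fitness`, `lam`, the toy network, the specific adaptation rule) are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch: pruning an already-trained MLP with a genetic algorithm
# whose fitness couples validation accuracy with a size penalty controlled by
# an adaptive parameter. Illustrative only; not the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Toy validation data and a "trained" single-hidden-layer network (random weights here).
X_val = rng.normal(size=(200, 8))
y_val = (X_val[:, 0] * X_val[:, 1] > 0).astype(int)    # synthetic labels
W1 = rng.normal(scale=0.5, size=(8, 16))                # input -> hidden
W2 = rng.normal(scale=0.5, size=(16, 1))                # hidden -> output
n_weights = W1.size + W2.size

def accuracy(mask):
    """Validation accuracy of the network with masked-out (pruned) weights."""
    m1 = mask[:W1.size].reshape(W1.shape)
    m2 = mask[W1.size:].reshape(W2.shape)
    h = np.tanh(X_val @ (W1 * m1))
    out = 1.0 / (1.0 + np.exp(-(h @ (W2 * m2)))).ravel()
    return np.mean((out > 0.5).astype(int) == y_val)

def fitness(mask, lam):
    """Reward generalization, penalize the fraction of surviving weights."""
    return accuracy(mask) - lam * mask.mean()

# Genetic algorithm over binary pruning masks.
pop_size, generations, p_mut = 30, 40, 0.02
population = rng.integers(0, 2, size=(pop_size, n_weights))
baseline = accuracy(np.ones(n_weights))    # unpruned generalization
lam = 0.1                                  # adaptive size-penalty parameter

for gen in range(generations):
    scores = np.array([fitness(ind, lam) for ind in population])
    best = population[np.argmax(scores)]

    # Adapt the penalty: push harder on size only while the best network's
    # generalization stays at or above the unpruned baseline.
    if accuracy(best) >= baseline:
        lam = min(lam * 1.05, 1.0)
    else:
        lam = max(lam * 0.9, 1e-3)

    # Tournament selection, one-point crossover, bit-flip mutation.
    parents = []
    for _ in range(pop_size):
        i, j = rng.integers(0, pop_size, size=2)
        parents.append(population[i] if scores[i] >= scores[j] else population[j])
    children = []
    for k in range(0, pop_size, 2):
        a, b = parents[k], parents[(k + 1) % pop_size]
        cut = rng.integers(1, n_weights)
        children.append(np.concatenate([a[:cut], b[cut:]]))
        children.append(np.concatenate([b[:cut], a[cut:]]))
    population = np.array(children[:pop_size])
    flips = rng.random(population.shape) < p_mut
    population = np.where(flips, 1 - population, population)

best = max(population, key=lambda ind: fitness(ind, lam))
print(f"kept {best.mean():.0%} of weights, "
      f"val. accuracy {accuracy(best):.2f} (unpruned {baseline:.2f})")
```

The key design choice mirrored from the abstract is the adaptive parameter: the size penalty is increased only while the pruned networks still generalize at least as well as the unpruned one, so selection pressure toward smaller networks never overrides generalization.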
URI: https://hdl.handle.net/20.500.14279/1857
ISSN: 0925-2312
DOI: 10.1016/S0925-2312(97)00050-7
Rights: © Elsevier
Type: Article
Affiliation: University of Central Florida 
Appears in Collections: Άρθρα/Articles

SCOPUS™ Citations: 55 (checked on 9 Nov 2023)

Web of Science™ Citations: 46 (last week: 0, last month: 0; checked on 27 Oct 2023)

All items on this site are protected by copyright