Please use this identifier to cite or link to this item: http://ktisis.cut.ac.cy/handle/10488/7137
Title: Overtraining in fuzzy ARTMAP: myth or reality?
Authors: Kasparis, Takis 
Georgiopoulos, Michael N.
Koufakou, Anna
Keywords: Neural networks
Computational complexity
Fuzzy sets
Issue Date: 2001
Publisher: IEEE
Source: International Joint Conference on Neural Networks, 2001, Washington, DC
Abstract: We examine the issue of overtraining in fuzzy ARTMAP. Overtraining in fuzzy ARTMAP manifests itself in two different ways: 1) it degrades the generalization performance of fuzzy ARTMAP as training progresses; and 2) it creates unnecessarily large fuzzy ARTMAP neural network architectures. In this work we demonstrate that overtraining happens in fuzzy ARTMAP and propose an old remedy for its cure: cross-validation. In our experiments we compare the performance of fuzzy ARTMAP that is trained: 1) until the completion of training, 2) for one epoch, and 3) until its performance on a validation set is maximized. The experiments were performed on artificial and real databases. The conclusion derived from these experiments is that cross-validation is a useful procedure in fuzzy ARTMAP, because it produces smaller fuzzy ARTMAP architectures with improved generalization performance. The trade-off is that cross-validation introduces additional computational complexity in the training phase of fuzzy ARTMAP.
URI: http://ktisis.cut.ac.cy/handle/10488/7137
ISSN: 1098-7576
DOI: 10.1109/IJCNN.2001.939529
Rights: © 2001 IEEE
Appears in Collections: Δημοσιεύσεις σε συνέδρια/Conference papers
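Note: The abstract describes stopping fuzzy ARTMAP training at the point where accuracy on a held-out validation set is maximized, instead of training to completion. The sketch below illustrates that stopping procedure with a deliberately simplified fuzzy ARTMAP (complement coding, fast learning, match tracking). The class, parameter values, and synthetic data are illustrative assumptions made for this record; they do not reproduce the experiments or databases reported in the paper.

# Minimal sketch: a simplified fuzzy ARTMAP trained epoch by epoch,
# keeping the network snapshot whose validation accuracy is highest.
# Names, parameters, and data below are illustrative assumptions.
import copy
import numpy as np


class SimpleFuzzyARTMAP:
    def __init__(self, rho_baseline=0.0, alpha=0.001, epsilon=0.0001):
        self.rho_baseline = rho_baseline  # baseline vigilance
        self.alpha = alpha                # choice parameter
        self.epsilon = epsilon            # match-tracking increment
        self.w = []                       # category weight vectors
        self.labels = []                  # class label of each category

    @staticmethod
    def _complement_code(x):
        # Map x in [0, 1]^d to [x, 1 - x] so the L1 norm of every input is d.
        return np.concatenate([x, 1.0 - x])

    def _train_one(self, x, y):
        I = self._complement_code(x)
        rho = self.rho_baseline
        if not self.w:
            self.w.append(I.copy())
            self.labels.append(y)
            return
        W = np.array(self.w)
        inter = np.minimum(I, W).sum(axis=1)          # |I ^ w_j|
        T = inter / (self.alpha + W.sum(axis=1))      # choice function
        match = inter / I.sum()                       # match function
        for j in np.argsort(-T):                      # search in order of choice
            if match[j] < rho:
                continue                              # fails vigilance
            if self.labels[j] == y:
                self.w[j] = np.minimum(I, self.w[j])  # resonance: fast learning
                return
            rho = match[j] + self.epsilon             # wrong class: match tracking
        self.w.append(I.copy())                       # no category fits: create one
        self.labels.append(y)

    def fit_epoch(self, X, Y):
        for x, y in zip(X, Y):
            self._train_one(x, y)

    def predict(self, X):
        W = np.array(self.w)
        out = []
        for x in X:
            I = self._complement_code(x)
            T = np.minimum(I, W).sum(axis=1) / (self.alpha + W.sum(axis=1))
            out.append(self.labels[int(np.argmax(T))])
        return np.array(out)


def train_with_validation(X_tr, y_tr, X_val, y_val, max_epochs=20):
    # Train for up to max_epochs and return the snapshot with the best
    # validation accuracy, i.e. the cross-validation stopping rule.
    net = SimpleFuzzyARTMAP()
    best_net, best_acc = None, -1.0
    for epoch in range(max_epochs):
        net.fit_epoch(X_tr, y_tr)
        acc = (net.predict(X_val) == y_val).mean()
        if acc > best_acc:
            best_acc, best_net = acc, copy.deepcopy(net)
    return best_net, best_acc


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.random((600, 2))
    y = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # simple synthetic two-class task
    net, acc = train_with_validation(X[:400], y[:400], X[400:500], y[400:500])
    print("categories:", len(net.w), "validation accuracy:", acc)

Keeping the best-so-far snapshot rather than the final network is what lets the procedure return a smaller architecture with better generalization, at the cost of the extra validation pass after every epoch.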
