Adaptive conjugate gradient algorithm for perceptron training
DSpace at IIT Bombay
Field | Value
Title | Adaptive conjugate gradient algorithm for perceptron training
Creator | NAGARAJA, G; BOSE, RPJC
Subject | perceptron; conjugate-gradient; linear separability; linear inequalities; training
Description | An adaptive conjugate-gradient algorithm for function minimization is developed for the problem of finding linear discriminant functions in pattern classification. On several datasets, the algorithm converges to a solution in a finite number of steps in both the consistent and the inconsistent case. We compare the performance of the adaptive algorithm with adaptive versions of the Ho-Kashyap procedure (AHK), and the batch version of the algorithm with batch-mode AHK. The results show that the proposed adaptive conjugate gradient algorithm (CGA) gives substantially better performance in terms of both the number of training cycles required and the classification rate, and that batch-mode CGA likewise performs much better than batch-mode AHK.
Publisher | ELSEVIER SCIENCE BV
Date | 2006 (issued); 2011-07-24T08:00:25Z (accessioned)
Type | Article
Identifier | NEUROCOMPUTING, 69(4-6), 368-386; ISSN 0925-2312; DOI: http://dx.doi.org/10.1016/j.neucom.2005.03.007; http://dspace.library.iitb.ac.in/xmlui/handle/10054/6362; http://hdl.handle.net/10054/6362
Language | en
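
The record above does not carry the algorithmic details of the adaptive CGA, so the following is only a minimal sketch of the general idea behind conjugate-gradient training of a linear discriminant: batch Polak-Ribiere conjugate-gradient descent on a squared-margin perceptron criterion over margin-normalized samples. The criterion, the Armijo line search, the restart rule, and all names (margin_criterion, cg_train, margin b) are assumptions made for this illustration; they are not the adaptive or batch CGA of the paper, nor the AHK procedure it is compared against.

# Illustrative sketch only -- not the paper's adaptive CGA.
# Samples are margin-normalized: each row of Y is y_i = label_i * [x_i, 1],
# so a correct classification with margin b satisfies Y_i . a >= b.
import numpy as np

def margin_criterion(a, Y, b=1.0):
    """Squared-margin perceptron criterion J(a) = 0.5 * sum_i max(0, b - Y_i.a)^2
    and its gradient (both assumptions of this sketch)."""
    viol = np.maximum(0.0, b - Y @ a)      # per-sample margin violations
    return 0.5 * np.sum(viol ** 2), -Y.T @ viol

def cg_train(Y, b=1.0, max_iter=500, tol=1e-8):
    """Batch Polak-Ribiere conjugate-gradient minimization of the criterion."""
    a = np.zeros(Y.shape[1])
    J, g = margin_criterion(a, Y, b)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:         # all margins satisfied (or stationary)
            break
        if g @ d >= 0.0:                    # not a descent direction: restart
            d = -g
        # Backtracking (Armijo) line search along d -- an assumption of this sketch.
        step, c1 = 1.0, 1e-4
        J_new, g_new = margin_criterion(a + step * d, Y, b)
        while J_new > J + c1 * step * (g @ d) and step > 1e-12:
            step *= 0.5
            J_new, g_new = margin_criterion(a + step * d, Y, b)
        a = a + step * d
        # Polak-Ribiere direction update with non-negativity restart.
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        J, g = J_new, g_new
    return a

# Usage: given data X (n x d) and labels t in {-1, +1}, build
#   Y = t[:, None] * np.hstack([X, np.ones((len(X), 1))])
# train with a = cg_train(Y), and classify a new sample x by the sign of
# a @ np.append(x, 1.0).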