International Conference on Pattern Recognition

Abstract

In this paper, we propose to train the RBF neural network using a global descent method. Essentially, the method imposes a monotonic transformation on the training objective to improve numerical sensitivity without altering the relative ordering of all local extrema. A gradient descent search which inherits the global descent property is derived to locate the global solution of an error objective. Numerical examples comparing the global descent algorithm with a gradient-based line-search algorithm show the superiority of the proposed global descent algorithm in terms of convergence speed and the quality of the solutions achieved.
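To make the idea concrete, the sketch below trains a simple RBF network by gradient descent on a monotonically transformed sum-of-squares error. The specific transform g(E) = log(1 + E), the RBF widths, and all function names are illustrative assumptions for this sketch, not the transformation or algorithm defined in the paper; the point is only that a monotonic g preserves the ordering of local minima while the chain rule rescales the gradient.

```python
import numpy as np

# Minimal RBF network: y(x) = sum_j w_j * exp(-||x - c_j||^2 / (2 * sigma^2)).
# E(w) is the usual sum-of-squares error; g(E) = log(1 + E) is a hypothetical
# monotonic transform, so its minima coincide with those of E but the gradient
# is rescaled by 1 / (1 + E).

def rbf_design_matrix(X, centers, sigma):
    # Phi[i, j] = exp(-||x_i - c_j||^2 / (2 * sigma^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_rbf_transformed(X, t, centers, sigma, lr=0.05, epochs=500):
    Phi = rbf_design_matrix(X, centers, sigma)
    w = np.zeros(centers.shape[0])
    for _ in range(epochs):
        err = Phi @ w - t
        E = 0.5 * np.sum(err ** 2)        # original error objective
        # chain rule through g(E) = log(1 + E): dg/dw = dE/dw / (1 + E)
        grad = (Phi.T @ err) / (1.0 + E)
        w -= lr * grad
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 1))
    t = np.sin(np.pi * X[:, 0]) + 0.05 * rng.standard_normal(200)
    centers = np.linspace(-1, 1, 10).reshape(-1, 1)
    w = train_rbf_transformed(X, t, centers, sigma=0.3)
    pred = rbf_design_matrix(X, centers, 0.3) @ w
    print("RMSE:", np.sqrt(np.mean((pred - t) ** 2)))
```

Because g is strictly increasing, any weight vector that lowers g(E) also lowers E, so the transformed search retains the extrema structure of the original objective while the gradient scaling can improve numerical conditioning far from a solution.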
