Ninth International Symposium on Neural Networks (ISNN 2012)

Overcoming the Local-Minimum Problem in Training Multilayer Perceptrons with the NRAE Training Method

A method of training multilayer perceptrons (MLPs) to reach a global or nearly global minimum of the standard mean squared error (MSE) criterion is proposed. It has been found that the region of the weight space that contains no non-global local minimum of the normalized risk-averting error (NRAE) criterion expands monotonically to the entire weight space as the risk-sensitivity index increases to infinity. If the MLP under training has enough hidden neurons, the MSE and NRAE criteria are both nearly zero at a global or nearly global minimum. Training the MLP with the NRAE criterion at a sufficiently large risk-sensitivity index can therefore effectively avoid non-global local minima. Numerical experiments show consistently successful convergence from different initial guesses of the MLP weights at a risk-sensitivity index above 10^6. The experiments are conducted on examples with non-global local minima of the MSE criterion that are difficult to escape from when training directly with the MSE criterion.
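The abstract does not restate the NRAE criterion itself; the sketch below assumes the definition used in the NRAE literature, C_lambda(w) = (1/lambda) * ln((1/K) * sum_k exp(lambda * eps_k(w))), where eps_k(w) is the squared error of the network on the k-th training sample and lambda is the risk-sensitivity index. The function name nrae and the toy error values are illustrative, not from the paper.

```python
import numpy as np

def nrae(errors, lam):
    """Normalized risk-averting error of per-sample squared errors.

    errors : 1-D array of squared errors eps_k(w) = ||y_k - f(x_k, w)||^2
    lam    : risk-sensitivity index lambda > 0

    Uses the log-sum-exp trick so that a very large lambda
    (e.g. the 10^6 reported in the abstract) does not overflow
    exp(lam * eps_k).
    """
    z = lam * np.asarray(errors, dtype=float)
    m = z.max()
    # (1/lam) * log( (1/K) * sum(exp(z)) ), computed stably
    return (m + np.log(np.mean(np.exp(z - m)))) / lam

# As lambda -> 0+ the NRAE approaches the MSE; as lambda grows the
# criterion is dominated by the worst-fitted sample.
eps = np.array([0.01, 0.02, 0.5])
print(nrae(eps, 1e-6))  # ~ mean(eps) = 0.177
print(nrae(eps, 1e6))   # ~ max(eps)  = 0.5
```

The stable evaluation matters in practice: at a risk-sensitivity index above 10^6, a naive computation of exp(lambda * eps_k) overflows for any non-negligible error, so the criterion must be evaluated in log space as above.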

