International Joint Conference on Neural Networks (IJCNN 2016)

Training Deep Neural Networks with Gradual Deconvexification


A new method for training deep neural networks, including convolutional networks, is proposed. The method gradually deconvexifies the normalized risk-averting error (NRAE) criterion and switches to the risk-averting error (RAE) criterion whenever the RAE becomes computationally manageable. The method creates tunnels between the depressed regions around saddle points, tilts the plateaus, and eliminates nonglobal local minima. Numerical experiments show the effectiveness of gradual deconvexification compared with unsupervised pretraining. After the minimization process, a statistical pruning method is applied to enhance the generalization capability of the network under training; numerical results show a further reduction of the testing criterion.
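For readers who want the shape of the procedure, the sketch below shows one plausible reading of the abstract in PyTorch. The RAE and NRAE formulas follow the definitions used in Lo's earlier work on risk-averting error criteria; everything else (the lambda schedule, the overflow-headroom test deciding when the RAE is "computationally manageable", and all helper names such as gdc_train) is an illustrative assumption rather than the paper's exact procedure, and the statistical pruning step is not shown.

import math
import torch

# Assumed definitions, with eps_k(w) = ||y_k - yhat_k||^2 the per-example
# squared error over K examples:
#   RAE:  J_lam(w) = sum_k exp(lam * eps_k(w))
#   NRAE: C_lam(w) = (1/lam) * ln( (1/K) * sum_k exp(lam * eps_k(w)) )

def nrae(per_example_sq_err: torch.Tensor, lam: float) -> torch.Tensor:
    # Numerically stable via logsumexp: C = (logsumexp(lam*eps) - ln K) / lam.
    k = per_example_sq_err.numel()
    return (torch.logsumexp(lam * per_example_sq_err, dim=0) - math.log(k)) / lam

def rae(per_example_sq_err: torch.Tensor, lam: float) -> torch.Tensor:
    # Direct RAE; safe only once lam * max(eps) is small enough for exp().
    return torch.exp(lam * per_example_sq_err).sum()

def rae_is_manageable(per_example_sq_err: torch.Tensor, lam: float) -> bool:
    # exp() overflows float32 near 88; the headroom margin is an assumption.
    return (lam * per_example_sq_err.max()).item() < 80.0

def gdc_train(model, x, y, lam0=1e4, shrink=0.9, lam_min=1.0, steps_per_lam=50):
    # Gradual deconvexification: start at a large lambda (nearly convex NRAE
    # landscape), shrink lambda, and switch to RAE once it is computable.
    opt = torch.optim.SGD(model.parameters(), lr=1e-3)
    lam, use_rae = lam0, False
    while lam > lam_min:
        for _ in range(steps_per_lam):
            opt.zero_grad()
            # Assumes 2-D (batch, output) tensors for model(x) and y.
            eps = ((model(x) - y) ** 2).sum(dim=1)
            loss = rae(eps, lam) if use_rae else nrae(eps, lam)
            loss.backward()
            opt.step()
        if not use_rae and rae_is_manageable(eps.detach(), lam):
            use_rae = True          # switch criteria once RAE is safe
        lam *= shrink               # gradually deconvexify
    return model

Starting from a large lambda exploits the near-convexity of the NRAE landscape; shrinking lambda toward zero recovers the ordinary mean squared error, and the switch to RAE happens as soon as exp(lam * eps) can be evaluated without overflow.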



Keywords: learning, neural networks


