AI/ML

Network Parameter Learning Using Nonlinear Transforms, Local Representation Goals and Local Propagation Constraints. (arXiv:1902.00016v1 [cs.LG])



In this paper, we introduce a novel concept for learning the parameters of a
neural network. Our idea is grounded in modeling a learning problem that
addresses a trade-off between (i) satisfying local objectives at each node and
(ii) achieving desired data propagation through the network, under (iii) local
propagation constraints. We consider two types of nonlinear transforms that
describe the network representations. One of the nonlinear transforms serves as
the activation function. The other enables locally adjusted, deviation-corrective
components to be included in the updates of the network weights, in
order to attain target-specific representations at the last network
node. Our learning principle not only provides insight into the understanding
and interpretation of the learning dynamics, but also offers theoretical
guarantees for a decoupled, parallel parameter estimation strategy that
enables learning in synchronous and asynchronous modes. Numerical experiments
validate the potential of our approach on an image recognition task. The
preliminary results show advantages over state-of-the-art methods with
respect to learning time and network size, while achieving competitive
recognition accuracy.
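
The abstract does not spell out the update equations, but the core idea of decoupled, per-node parameter estimation can be illustrated with a small sketch. Everything below is an assumption for illustration, not the paper's actual algorithm: each layer's weights are updated against a purely local target representation (the hypothetical `local_update` and `targets` names are made up here), which is what would let the layer updates run in parallel or asynchronously.

```python
import numpy as np

# Hypothetical sketch of decoupled, per-layer parameter updates.
# None of the symbols below come from the paper; they illustrate the
# general idea of trading off a local representation objective against
# a propagation constraint (stood in for by weight decay).

rng = np.random.default_rng(0)

def relu(x):
    # First nonlinear transform: the activation function.
    return np.maximum(x, 0.0)

def local_update(W, x_in, z_target, lam=0.1, lr=1e-2):
    """One decoupled update for a single layer.

    Minimizes 0.5 * ||relu(W @ x_in) - z_target||^2 + 0.5 * lam * ||W||^2,
    i.e. a local objective plus a simple stand-in for a propagation
    constraint. Depends only on this layer's input and target.
    """
    pre = W @ x_in
    y = relu(pre)
    mask = (pre > 0).astype(pre.dtype)          # ReLU subgradient
    grad = np.outer((y - z_target) * mask, x_in) + lam * W
    return W - lr * grad

# Toy three-layer network on random data.
dims = [8, 16, 16, 4]
Ws = [rng.standard_normal((dims[i + 1], dims[i])) * 0.1
      for i in range(len(dims) - 1)]
x = rng.standard_normal(dims[0])

for step in range(100):
    # Forward pass to collect per-layer inputs.
    acts = [x]
    for W in Ws:
        acts.append(relu(W @ acts[-1]))

    # Hypothetical local targets: the current post-activation outputs
    # plus a small corrective term (a loose stand-in for the paper's
    # second, deviation-corrective transform).
    targets = [a + 0.01 * rng.standard_normal(a.shape) for a in acts[1:]]

    # Each update depends only on one layer's own input and target, so
    # the loop body could be executed in parallel or asynchronously.
    Ws = [local_update(W, a_in, t)
          for W, a_in, t in zip(Ws, acts[:-1], targets)]
```

Under this reading, the appeal of the decoupled formulation is that no global backward pass is needed: each layer's estimation problem is self-contained, which is what makes synchronous and asynchronous training modes possible.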




