Neural Networks, IEEE - INNS - ENNS International Joint Conference on

Abstract

A new method is given for speeding up learning in a deep neural network with many hidden layers, by partially partitioning the network rather than fully interconnecting adjacent layers. Empirical results are shown both for learning a simple Boolean function on a standard backprop network, and for learning two different, complex, real-world vision tasks on a more sophisticated convolutional network. In all cases, the performance of the proposed system was better than that of traditional systems: the partially-partitioned network outperformed both the fully-partitioned and the fully-unpartitioned networks.
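The abstract does not give the construction details, but the idea of partial partitioning can be illustrated with a connectivity mask: hidden units are split into groups, and each output group connects only to its own input group plus a few neighbouring groups. The function names, the neighbourhood scheme, and the `overlap` parameter below are assumptions for illustration, not the paper's actual method; a sketch in NumPy:

```python
import numpy as np

def partition_mask(n_in, n_out, n_groups, overlap=1):
    """Connectivity mask for a partially-partitioned layer (hypothetical scheme).

    Units are split into n_groups blocks; each output block connects to its
    own input block plus `overlap` neighbouring blocks on each side.
    overlap=0 gives a fully-partitioned (block-diagonal) layer; a large
    enough overlap recovers the fully-interconnected (dense) layer.
    """
    in_groups = np.array_split(np.arange(n_in), n_groups)
    out_groups = np.array_split(np.arange(n_out), n_groups)
    mask = np.zeros((n_out, n_in))
    for g, outs in enumerate(out_groups):
        lo, hi = max(0, g - overlap), min(n_groups, g + overlap + 1)
        for h in range(lo, hi):
            mask[np.ix_(outs, in_groups[h])] = 1.0
    return mask

def forward(x, W, b, mask):
    # Weights outside the partition never contribute, so the layer has
    # fewer effective parameters to learn than a dense layer.
    return np.tanh((W * mask) @ x + b)
```

With `overlap=0` the mask is block-diagonal (fully partitioned); increasing `overlap` interpolates toward full interconnection, which is the spectrum the abstract's comparison spans.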
