Author Pape, Leo ♦ Gomez, Faustino ♦ Ring, Mark ♦ Schmidhuber, Jürgen
Source CiteSeerX
Content type Text
File Format PDF
Subject Domain (in DDC) Computer science, information & general works ♦ Data processing & computer science
Subject Keyword Modular Deep Belief Network ♦ Compact Representation ♦ Training Data ♦ Forgetting Problem ♦ Subsequent Training Stage ♦ Feature Mapping ♦ Complete Training Set ♦ Deep Belief Network ♦ Relevant Feature Change ♦ Monolithic DBN ♦ M-DBN ♦ Input Distribution ♦ Unsupervised Modular DBN ♦ High-dimensional Data ♦ Learning Method ♦ MNIST Digit
Abstract Deep belief networks (DBNs) are popular for learning compact representations of high-dimensional data. However, most approaches so far rely on having a single, complete training set. If the distribution of relevant features changes during subsequent training stages, the features learned in earlier stages are gradually forgotten. Often it is desirable for learning algorithms to retain what they have previously learned, even if the input distribution temporarily changes. This paper introduces the M-DBN, an unsupervised modular DBN that addresses the forgetting problem. M-DBNs are composed of a number of modules that are trained only on the samples they reconstruct best. While modularization by itself does not prevent forgetting, the M-DBN additionally uses a learning method that adjusts each module’s learning rate in proportion to the fraction of samples it reconstructs best. On the MNIST handwritten digit dataset, module specialization largely corresponds to the digits discerned by humans. Furthermore, in several learning tasks with changing MNIST digits, M-DBNs retain learned features even after those features are removed from the training data, while monolithic DBNs of comparable size forget previously learned feature mappings.
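The abstract describes a two-step training loop: route each sample to the module that reconstructs it best, then scale each module’s learning rate by the fraction of samples routed to it. The following Python sketch illustrates that mechanism only; all names (Module, reconstruction_error, train_step, base_lr) and the toy linear reconstruction are illustrative assumptions, not the authors’ actual implementation, which trains full DBN modules with contrastive divergence.

```python
import numpy as np

class Module:
    """Stand-in for one DBN module, exposing a reconstruction-based interface."""
    def __init__(self, dim, rng):
        self.w = rng.normal(scale=0.01, size=(dim, dim))  # toy weights

    def reconstruction_error(self, x):
        # Toy linear reconstruction: encode, decode, measure squared L2 error.
        return float(np.sum((x - self.w.T @ (self.w @ x)) ** 2))

    def train_step(self, batch, lr):
        pass  # a real DBN module would run contrastive divergence here

def train_mdbn_epoch(modules, samples, base_lr=0.1):
    # 1) Route each sample to the module that reconstructs it best.
    buckets = {i: [] for i in range(len(modules))}
    for x in samples:
        errs = [m.reconstruction_error(x) for m in modules]
        buckets[int(np.argmin(errs))].append(x)

    # 2) Scale each module's learning rate by the fraction of samples it won,
    #    so rarely selected modules barely change and thus retain old features.
    for i, m in enumerate(modules):
        frac = len(buckets[i]) / len(samples)
        if buckets[i]:
            m.train_step(np.stack(buckets[i]), lr=base_lr * frac)

rng = np.random.default_rng(0)
modules = [Module(dim=16, rng=rng) for _ in range(4)]
samples = [rng.normal(size=16) for _ in range(100)]
train_mdbn_epoch(modules, samples)
```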
Educational Role Student ♦ Teacher
Age Range Above 22 years
Educational Use Research
Education Level Undergraduate and Postgraduate ♦ Career/Technical Study
Learning Resource Type Article
Publisher Date 2011-01-01