57. Pipelined Parallel Contrastive Divergence for Continuous Generative Model Learning

Department: Bioengineering
Research Institute Affiliation: Graduate Program in Computational Science, Mathematics, and Engineering (CSME)
Faculty Advisor(s): Gert Cauwenberghs
Award(s): Best Literature Review Award

Primary Student
Name: Bruno Umbria Pedroni
Email: bpedroni@ucsd.edu
Phone: 619-534-2025
Grad Year: 2018

Student Collaborators
Sadique Sheik, ssheik@ucsd.edu

Abstract
A method for continuously processing and learning from data in Restricted Boltzmann Machines (RBMs) is proposed. Traditionally, RBMs are trained using Contrastive Divergence (CD), an algorithm consisting of two phases, of which only one is driven by data. This not only prohibits training RBMs in conjunction with continuous-time data streams, especially in event-based real-time systems, but also limits the training speed of RBMs in large-scale machine learning systems. The model we propose trades time for space and, by pipelining information propagation through the network, is capable of processing both phases of the CD learning algorithm simultaneously. Simulation results of our model on generative and discriminative tasks show convergence to the original CD algorithm. We conclude with a discussion of applying our method to other deep neural networks, enabling continuous learning and reduced training time.
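For context, the standard (non-pipelined) CD-1 update that the abstract contrasts against can be sketched as follows. This is a minimal illustration in NumPy, not the authors' pipelined implementation; bias terms and hidden-state sampling are omitted for brevity, and all names and sizes are illustrative. It shows the two phases the abstract refers to: a positive phase driven by the data and a negative (reconstruction) phase driven by the model, which in standard CD must run sequentially.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(W, v0, lr=0.1):
    """One standard CD-1 update (biases and sampling omitted for brevity)."""
    # Positive phase: hidden activations driven by the data vector v0.
    h0 = sigmoid(v0 @ W)
    # Negative phase: reconstruct visibles, then hiddens, from the model.
    v1 = sigmoid(h0 @ W.T)
    h1 = sigmoid(v1 @ W)
    # Update: difference between positive and negative phase statistics.
    W += lr * (np.outer(v0, h0) - np.outer(v1, h1))
    return W

# Toy example: 6 visible units, 4 hidden units, one binary data vector.
W = rng.normal(scale=0.1, size=(6, 4))
v = rng.integers(0, 2, size=6).astype(float)
W = cd1_step(W, v)
```

Because the negative phase depends on the positive phase's output, each data vector occupies the network for both phases; the pipelined model described in the abstract overlaps these phases across the network instead.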

Industry Application Area(s)
Aerospace, Defense, Security | Electronics/Photonics | Software, Analytics