{ "id": "1801.08094", "version": "v1", "published": "2018-01-24T17:38:48.000Z", "updated": "2018-01-24T17:38:48.000Z", "title": "PRNN: Recurrent Neural Network with Persistent Memory", "authors": [ "Kui Zhao", "Yuechuan Li", "Chi Zhang", "Cheng Yang", "Shenghuo Zhu" ], "comment": "7 pages", "categories": [ "cs.LG", "cs.AI", "stat.ML" ], "abstract": "Although Recurrent Neural Network (RNN) has been a powerful tool for modeling sequential data, its performance is inadequate when processing sequences with multiple patterns. In this paper, we address this challenge by introducing an external memory and constructing a novel persistent memory augmented RNN (term as PRNN) model. The PRNN model captures the principle patterns in training sequences and stores them in the explicit memory. By leveraging the persistent memory, the proposed method can adaptively update states according to the similarities between encoded inputs and memory slots, leading to a stronger capacity in assimilating sequences with multiple patterns. Content-based addressing is suggested in memory accessing, and gradient descent is utilized for implicitly updating the memory. Experiments on several datasets demonstrate the effectiveness of the proposed method.", "revisions": [ { "version": "v1", "updated": "2018-01-24T17:38:48.000Z" } ], "analyses": { "keywords": [ "recurrent neural network", "novel persistent memory augmented rnn", "multiple patterns", "prnn model captures", "datasets demonstrate" ], "note": { "typesetting": "TeX", "pages": 7, "language": "en", "license": "arXiv", "status": "editable" } } }