{ "id": "1809.09399", "version": "v1", "published": "2018-09-25T10:29:18.000Z", "updated": "2018-09-25T10:29:18.000Z", "title": "Non-Iterative Knowledge Fusion in Deep Convolutional Neural Networks", "authors": [ "Mikhail Iu. Leontev", "Viktoriia Islenteva", "Sergey V. Sukhov" ], "comment": "19 pages, 8 figures", "categories": [ "cs.LG", "stat.ML" ], "abstract": "Incorporation of a new knowledge into neural networks with simultaneous preservation of the previous one is known to be a nontrivial problem. This problem becomes even more complex when new knowledge is contained not in new training examples, but inside the parameters (connection weights) of another neural network. Here we propose and test two methods allowing combining the knowledge contained in separate networks. One method is based on a simple operation of summation of weights of constituent neural networks. Another method assumes incorporation of a new knowledge by modification of weights nonessential for the preservation of already stored information. We show that with these methods the knowledge from one network can be transferred into another one non-iteratively without requiring training sessions. The fused network operates efficiently, performing classification far better than a chance level. The efficiency of the methods is quantified on several publicly available data sets in classification tasks both for shallow and deep neural networks.", "revisions": [ { "version": "v1", "updated": "2018-09-25T10:29:18.000Z" } ], "analyses": { "keywords": [ "deep convolutional neural networks", "non-iterative knowledge fusion", "performing classification far better", "deep neural networks", "constituent neural networks" ], "note": { "typesetting": "TeX", "pages": 19, "language": "en", "license": "arXiv", "status": "editable" } } }