{ "id": "1706.04265", "version": "v1", "published": "2017-06-13T21:57:53.000Z", "updated": "2017-06-13T21:57:53.000Z", "title": "Transfer entropy-based feedback improves performance in artificial neural networks", "authors": [ "Sebastian Herzog", "Christian Tetzlaff", "Florentin Wörgötter" ], "categories": [ "cs.LG", "cs.IT", "cs.NE", "math.IT" ], "abstract": "The structure of the majority of modern deep neural networks is characterized by uni- directional feed-forward connectivity across a very large number of layers. By contrast, the architecture of the cortex of vertebrates contains fewer hierarchical levels but many recurrent and feedback connections. Here we show that a small, few-layer artificial neural network that employs feedback will reach top level performance on a standard benchmark task, otherwise only obtained by large feed-forward structures. To achieve this we use feed-forward transfer entropy between neurons to structure feedback connectivity. Transfer entropy can here intuitively be understood as a measure for the relevance of certain pathways in the network, which are then amplified by feedback. Feedback may therefore be key for high network performance in small brain-like architectures.", "revisions": [ { "version": "v1", "updated": "2017-06-13T21:57:53.000Z" } ], "analyses": { "keywords": [ "transfer entropy-based feedback", "performance", "few-layer artificial neural network", "vertebrates contains fewer hierarchical levels", "modern deep neural networks" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }