{ "id": "2212.12639", "version": "v1", "published": "2022-12-24T02:56:48.000Z", "updated": "2022-12-24T02:56:48.000Z", "title": "A Note on Matrix Measure Flows, With Applications to the Contraction Analysis of Plastic Neural Networks", "authors": [ "Leo Kozachkov", "Jean-Jacques Slotine" ], "categories": [ "math.DS" ], "abstract": "Synapses--the connections between neurons in the brain--are constantly being updated. This updating is done for variety of reasons, such as helping the brain learn new tasks or adapt to new environments. However, synaptic plasticity poses a challenge for stability analyses of recurrent neural networks, which typically assume fixed synapses. To help overcome this challenge, we introduce the notion of a matrix measure flow. Given a matrix flow, a matrix measure flow captures the evolution of an associated matrix measure (or logarithmic norm). We show that for certain matrix flows of interest in computational neuroscience, the associated matrix measure flow obeys a simple inequality. This inequality can be used in turn to infer the stability and contraction properties of recurrent neural networks with plastic synapses. We consider examples of synapses undergoing Hebbian and/or Anti-Hebbian plasticity, as well as covariance-based and gradient-based rules.", "revisions": [ { "version": "v1", "updated": "2022-12-24T02:56:48.000Z" } ], "analyses": { "keywords": [ "plastic neural networks", "contraction analysis", "recurrent neural networks", "matrix flow", "applications" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }