{ "id": "2309.07367", "version": "v1", "published": "2023-09-14T01:00:05.000Z", "updated": "2023-09-14T01:00:05.000Z", "title": "The kernel-balanced equation for deep neural networks", "authors": [ "Kenichi Nakazato" ], "categories": [ "cond-mat.dis-nn", "cs.AI", "cs.LG" ], "abstract": "Deep neural networks have found many fruitful applications over the past decade. A network learns a generalized function through training on a finite dataset. The degree of generalization reflects a proximity scale in the data space; this scale, however, is not well defined when the dataset is complicated. Here we consider a network trained to estimate the distribution of a dataset. We show that the estimation is unstable and that the instability depends on the data density and the training duration. We derive the kernel-balanced equation, which gives a concise phenomenological description of the solution. The equation explains the origin of the instability and the mechanism that sets the scale: the network outputs a local average of the dataset as its prediction, and the scale of averaging is determined by the equation. The scale gradually decreases over the course of training and finally results in instability in our case.", "revisions": [ { "version": "v1", "updated": "2023-09-14T01:00:05.000Z" } ], "analyses": { "keywords": [ "deep neural networks", "kernel-balanced equation", "instability", "scale gradually decreases", "data space" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }