arXiv:2202.00004 [cs.LG]
On Polynomial Approximation of Activation Function
Published 2022-01-29Version 1
In this work, we propose a method that approximates an activation function over a given domain by polynomials of a preset low degree. The main idea can be seen as an extension of the ordinary least-squares method: the gradient of the activation function is included in the cost function to be minimized.
Comments: In this work, we propose a method to approximate an activation function by a polynomial of preset low degree. Compared to the ordinary least-squares method, our approach is more flexible: its additional parameters give finer control over the shape of the resulting polynomial approximation.
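The abstract describes extending ordinary least squares with a gradient term in the cost function. A minimal sketch of that idea, assuming a weighted sum of value and derivative mismatches as the objective (the weight `lam`, the sample count, and the sigmoid target are illustrative choices, not details from the paper):

```python
import numpy as np

def fit_poly_with_gradient(f, fprime, a, b, degree, lam=1.0, n=200):
    """Fit a degree-`degree` polynomial p to f on [a, b], minimizing
    sum (p(x)-f(x))^2 + lam * sum (p'(x)-f'(x))^2 over sample points."""
    x = np.linspace(a, b, n)
    # Vandermonde matrix: column j holds x**j, so V @ c evaluates p(x)
    V = np.vander(x, degree + 1, increasing=True)
    # Derivative design matrix: column j holds j * x**(j-1), so D @ c is p'(x)
    D = np.zeros_like(V)
    for j in range(1, degree + 1):
        D[:, j] = j * x ** (j - 1)
    # Stacking the two residuals turns the combined objective into one
    # ordinary least-squares problem.
    A = np.vstack([V, np.sqrt(lam) * D])
    y = np.concatenate([f(x), np.sqrt(lam) * fprime(x)])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs  # coeffs[j] multiplies x**j

# Example target: the sigmoid activation and its derivative
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
dsigmoid = lambda x: sigmoid(x) * (1.0 - sigmoid(x))
c = fit_poly_with_gradient(sigmoid, dsigmoid, -4.0, 4.0, degree=3, lam=1.0)
```

Setting `lam=0` recovers the ordinary least-squares fit; increasing it trades pointwise accuracy for a better match of the slope, which is one way the extra parameter can shape the resulting polynomial.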
Related articles:
arXiv:1809.03272 [cs.LG] (Published 2018-09-10)
Privacy-Preserving Deep Learning for any Activation Function
arXiv:1908.05660 [cs.LG] (Published 2019-08-16)
Effect of Activation Functions on the Training of Overparametrized Neural Nets
arXiv:2310.04327 [cs.LG] (Published 2023-10-06)
Program Synthesis with Best-First Bottom-Up Search