{ "id": "2106.16043", "version": "v1", "published": "2021-06-30T13:12:05.000Z", "updated": "2021-06-30T13:12:05.000Z", "title": "Study of the robustness of neural networks based on spintronic neurons", "authors": [ "Eleonora Raimondo", "Anna Giordano", "Andrea Grimaldi", "Vito Puliafito", "Mario Carpentieri", "Zhongming Zeng", "Riccardo Tomasello", "Giovanni Finocchio" ], "categories": [ "cond-mat.mes-hall" ], "abstract": "Spintronic technology is emerging as a direction for the hardware implementation of neurons and synapses of neuromorphic architectures. In particular, a single spintronic device can be used to implement the nonlinear activation function of neurons. Here, we propose how to implement spintronic neurons with a sigmoidal and ReLU-like activation functions. We then perform a numerical experiment showing the robustness of neural networks made by spintronic neurons all having different activation functions to emulate device-to-device variations in a possible hardware implementation of the network. Therefore, we consider a vanilla neural network implemented to recognize the categories of the Mixed National Institute of Standards and Technology database, and we show an average accuracy of 98.87 % in the test dataset which is very close to the 98.89% as obtained for the ideal case (all neurons have the same sigmoid activation function). Similar results are also obtained with neurons having a ReLU-like activation function.", "revisions": [ { "version": "v1", "updated": "2021-06-30T13:12:05.000Z" } ], "analyses": { "keywords": [ "robustness", "relu-like activation function", "hardware implementation", "single spintronic device", "nonlinear activation function" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }