{ "id": "2306.09750", "version": "v1", "published": "2023-06-16T10:34:49.000Z", "updated": "2023-06-16T10:34:49.000Z", "title": "Fedstellar: A Platform for Decentralized Federated Learning", "authors": [ "Enrique Tomás Martínez Beltrán", "Ángel Luis Perales Gómez", "Chao Feng", "Pedro Miguel Sánchez Sánchez", "Sergio López Bernal", "Gérôme Bovet", "Manuel Gil Pérez", "Gregorio Martínez Pérez", "Alberto Huertas Celdrán" ], "categories": [ "cs.LG", "cs.AI", "cs.DC", "cs.NI" ], "abstract": "In 2016, Google proposed Federated Learning (FL) as a novel paradigm to train Machine Learning (ML) models across the participants of a federation while preserving data privacy. Since its birth, Centralized FL (CFL) has been the most used approach, where a central entity aggregates participants' models to create a global one. However, CFL presents limitations such as communication bottlenecks, single point of failure, and reliance on a central server. Decentralized Federated Learning (DFL) addresses these issues by enabling decentralized model aggregation and minimizing dependency on a central entity. Despite these advances, current platforms training DFL models struggle with key issues such as managing heterogeneous federation network topologies. To overcome these challenges, this paper presents Fedstellar, a novel platform designed to train FL models in a decentralized, semi-decentralized, and centralized fashion across diverse federations of physical or virtualized devices. The Fedstellar implementation encompasses a web application with an interactive graphical interface, a controller for deploying federations of nodes using physical or virtual devices, and a core deployed on each device which provides the logic needed to train, aggregate, and communicate in the network. The effectiveness of the platform has been demonstrated in two scenarios: a physical deployment involving single-board devices such as Raspberry Pis for detecting cyberattacks, and a virtualized deployment comparing various FL approaches in a controlled environment using MNIST and CIFAR-10 datasets. In both scenarios, Fedstellar demonstrated consistent performance and adaptability, achieving F1 scores of 91%, 98%, and 91.2% using DFL for detecting cyberattacks and classifying MNIST and CIFAR-10, respectively, reducing training time by 32% compared to centralized approaches.", "revisions": [ { "version": "v1", "updated": "2023-06-16T10:34:49.000Z" } ], "analyses": { "keywords": [ "decentralized federated learning", "fedstellar", "heterogeneous federation network topologies", "central entity", "platforms training dfl models struggle" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }