Show simple item record

dc.contributor.author: Odeyomi, Olusola T.
dc.contributor.author: Zaruba, Gergely
dc.identifier.citation: O. T. Odeyomi and G. Zaruba, "Privacy-Preserving Online Mirror Descent for Federated Learning with Single-Sided Trust," 2021 IEEE Symposium Series on Computational Intelligence (SSCI), 2021, pp. 1-7, doi: 10.1109/SSCI50451.2021.9659544.
dc.description: Click on the DOI to access this article (may not be free).
dc.description.abstract: This paper discusses how clients in a federated learning system can collaborate with privacy guarantees in a fully decentralized setting without a central server. Most existing work relies on a central server that aggregates the clients' local updates and coordinates the training, a setting prone to communication and computational bottlenecks, especially when a large number of clients is involved. Moreover, most existing federated learning algorithms do not cater for situations where the data distribution is time-varying, such as in real-time traffic monitoring. To address these problems, this paper proposes a differentially-private online mirror descent algorithm. To provide additional privacy for the clients' loss gradients, local differential privacy is introduced. Simulation results are based on a proposed differentially-private exponential gradient algorithm, a variant of the differentially-private online mirror descent algorithm with an entropic regularizer. The simulation shows that all the clients converge to the global optimal vector over time. The regret bound of the proposed differentially-private exponential gradient algorithm is compared with the regret bounds of state-of-the-art online federated learning algorithms from the literature.
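The update the abstract describes — online mirror descent with an entropic regularizer (i.e., exponentiated gradient), with noise added to the loss gradients for local differential privacy — might be sketched as follows for a single client. This is an illustrative assumption, not the paper's exact algorithm: the function name, the Laplace mechanism, and all parameters (`eta`, `epsilon`, `sensitivity`) are placeholders chosen for the sketch.

```python
import numpy as np

def dp_exponentiated_gradient(grads, eta=0.1, epsilon=1.0, sensitivity=1.0, rng=None):
    """Illustrative sketch: one client's differentially-private
    exponentiated-gradient updates over a sequence of loss gradients.

    Each gradient is perturbed with Laplace noise of scale
    sensitivity / epsilon (a standard local-DP mechanism, assumed here)
    before the multiplicative-weights (entropic mirror descent) step.
    """
    rng = np.random.default_rng(rng)
    d = len(grads[0])
    w = np.full(d, 1.0 / d)  # start at the uniform point of the simplex
    for g in grads:
        noisy_g = g + rng.laplace(scale=sensitivity / epsilon, size=d)
        w = w * np.exp(-eta * noisy_g)  # entropic mirror-descent step
        w /= w.sum()                    # renormalize onto the probability simplex
    return w
```

With a consistent gradient signal and a weak privacy requirement (large `epsilon`, so little noise), the iterate concentrates on the coordinate with the smallest loss gradient, which is the convergence behavior the abstract reports in the noiseless limit.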
dc.relation.ispartofseries: 2021 IEEE Symposium Series on Computational Intelligence (SSCI); 2021
dc.subject: Federated learning
dc.subject: Differential privacy
dc.subject: Online learning
dc.subject: Graph theory
dc.title: Privacy-preserving online mirror descent for federated learning with single-sided trust
dc.type: Conference paper
dc.rights.holder: ©2021 IEEE

Files in this item


There are no files associated with this item.

