
dc.contributor.author: Odeyomi, Olusola T.
dc.contributor.author: Zaruba, Gergely
dc.date.accessioned: 2022-04-16T20:58:13Z
dc.date.available: 2022-04-16T20:58:13Z
dc.date.issued: 2021-12-05
dc.identifier.citation: O. T. Odeyomi and G. Zaruba, "Privacy-Preserving Online Mirror Descent for Federated Learning with Single-Sided Trust," 2021 IEEE Symposium Series on Computational Intelligence (SSCI), 2021, pp. 1-7, doi: 10.1109/SSCI50451.2021.9659544.
dc.identifier.isbn: 978-1-7281-9048-8
dc.identifier.uri: https://soar.wichita.edu/handle/10057/23096
dc.identifier.uri: http://doi.org/10.1109/SSCI50451.2021.9659544
dc.description: Click on the DOI to access this article (may not be free).
dc.description.abstract: This paper discusses how clients in a federated learning system can collaborate with privacy guarantees in a fully decentralized setting without a central server. Most existing work relies on a central server that aggregates the local updates from the clients and coordinates the training. This setting is therefore prone to communication and computational bottlenecks, especially when a large number of clients is involved. Also, most existing federated learning algorithms do not cater for situations where the data distribution is time-varying, such as in real-time traffic monitoring. To address these problems, this paper proposes a differentially-private online mirror descent algorithm. To provide additional privacy for the loss gradients of the clients, local differential privacy is introduced. Simulation results are based on a proposed differentially-private exponential gradient algorithm, which is a variant of the differentially-private online mirror descent algorithm with an entropic regularizer; a minimal sketch of such an update follows this record. The simulations show that all the clients converge to the global optimal vector over time. The regret bound of the proposed differentially-private exponential gradient algorithm is compared with the regret bounds of some state-of-the-art online federated learning algorithms found in the literature.
dc.language.iso: en_US
dc.publisher: IEEE
dc.relation.ispartofseries: 2021 IEEE Symposium Series on Computational Intelligence (SSCI); 2021
dc.subject: Federated learning
dc.subject: Differential privacy
dc.subject: Online learning
dc.subject: Regret
dc.subject: Graph theory
dc.title: Privacy-preserving online mirror descent for federated learning with single-sided trust
dc.type: Conference paper
dc.rights.holder: ©2021 IEEE
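
A note on the abstract above: the exponential gradient algorithm it refers to is online mirror descent with an entropic regularizer, i.e., a multiplicative-weights update on the probability simplex, with local differential privacy obtained by perturbing each client's loss gradient before it is shared. The sketch below is not the paper's implementation; the function name, learning rate, privacy budget, Laplace noise mechanism, and gradient sensitivity are illustrative assumptions.

```python
import numpy as np

def dp_exponentiated_gradient_step(w, grad, eta=0.5, epsilon=1.0,
                                    sensitivity=1.0, rng=None):
    """One entropic-mirror-descent (exponentiated gradient) update with a
    Laplace-perturbed gradient for local differential privacy.

    All parameter values here are illustrative assumptions, not the paper's.
    w           -- current weight vector on the probability simplex
    grad        -- loss gradient observed by the client in this round
    eta         -- learning rate
    epsilon     -- per-round local differential privacy budget
    sensitivity -- assumed L1 sensitivity of the gradient
    """
    rng = np.random.default_rng() if rng is None else rng
    # Local DP: the client adds Laplace noise to its own gradient before
    # the gradient (or the resulting update) is shared with neighbors.
    noisy_grad = grad + rng.laplace(scale=sensitivity / epsilon, size=grad.shape)
    # Mirror descent with the entropic regularizer is a multiplicative
    # update followed by renormalization back onto the simplex.
    w_new = w * np.exp(-eta * noisy_grad)
    return w_new / w_new.sum()

# Example: a client starts from the uniform point on the simplex and
# applies one noisy update to a toy gradient.
w = np.ones(5) / 5
g = np.array([0.2, -0.1, 0.4, 0.0, -0.3])
print(dp_exponentiated_gradient_step(w, g, eta=0.5, epsilon=0.5))
```

The entropic regularizer is what turns the usual additive gradient step into this multiplicative form, and the noise scale sensitivity/epsilon is the standard Laplace-mechanism choice for epsilon-local differential privacy on an L1-bounded gradient.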


Files in this item: There are no files associated with this item.
