Differentially-private federated learning with long-term constraints using online mirror descent

Authors
Odeyomi, Olusola T.
Záruba, Gergely V.
Issue Date
2021-07-12
Type
Conference paper
Keywords
Differential privacy, Privacy, Computational modeling, Aggregates, Training data, Collaborative work, Servers
Citation
Odeyomi, O., & Zaruba, G. (2021). Differentially-private federated learning with long-term constraints using online mirror descent. Paper presented at the IEEE International Symposium on Information Theory (ISIT), 2021-July, 1308-1313. doi:10.1109/ISIT45174.2021.9518177
Abstract

This paper discusses a fully decentralized online federated learning setting with long-term constraints. The fully decentralized setting removes the communication and computational bottleneck of a central server exchanging updates with a large number of clients. Online learning is introduced into the federated learning setting to capture situations with time-varying data distributions. Practical federated learning deployments are subject to long-term constraints such as energy, monetary cost, and time budgets. The clients are not obligated to satisfy any per-round constraint, but they must satisfy these constraints in the long run. To protect the local model updates that clients share, local differential privacy is introduced. An online mirror descent-based algorithm is proposed and its regret bound is derived. The regret bound is compared with that of a differentially-private online gradient descent algorithm proposed for federated learning.
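To make the kind of update the abstract describes concrete, here is a minimal Python sketch (not the authors' exact algorithm) of one client's step: an entropic online mirror descent update on the probability simplex, with Laplace noise added to the shared gradient for local differential privacy and the long-term constraint handled through a Lagrange multiplier. The function name, step sizes, and noise scale below are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch, assuming an entropic mirror map (simplex domain),
# Laplace noise for local DP, and a primal-dual treatment of one
# long-term constraint g(x) <= 0. All constants are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def dp_omd_step(x, loss_grad, cons_val, cons_grad, lam,
                eta=0.1, mu=0.05, b=0.5):
    """One differentially-private mirror descent step.

    x          current point on the probability simplex
    loss_grad  gradient of this round's loss at x
    cons_val   current constraint value g(x)
    cons_grad  gradient of the constraint at x
    lam        Lagrange multiplier for the long-term constraint
    b          Laplace noise scale (local differential privacy)
    """
    # Local DP: perturb the gradient before it leaves the client.
    noisy_grad = loss_grad + rng.laplace(scale=b, size=loss_grad.shape)
    # Penalized direction: loss gradient plus lam * constraint gradient.
    g = noisy_grad + lam * cons_grad
    # Entropic mirror step = multiplicative-weights update, then normalize.
    x_new = x * np.exp(-eta * g)
    x_new /= x_new.sum()
    # Dual ascent: the multiplier grows while the constraint is violated,
    # so violations cannot persist in the long run.
    lam_new = max(0.0, lam + mu * cons_val)
    return x_new, lam_new

# Toy usage: linear losses with a per-round budget sum(c * x) <= 0.5,
# enforced only in the long-term average.
d, T = 5, 200
x, lam = np.full(d, 1.0 / d), 0.0
c = rng.uniform(size=d)
for t in range(T):
    grad = rng.normal(size=d)   # stand-in for the round-t loss gradient
    g_val = c @ x - 0.5         # constraint violation this round
    x, lam = dp_omd_step(x, grad, g_val, c, lam)
```

The dual step is what allows per-round violations while still driving the cumulative violation sublinear, which is the usual mechanism behind long-term-constraint regret bounds of this type.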

Description
Click on the DOI link to access this conference paper at the publisher's website (may not be free).
Publisher
IEEE
Series
2021 IEEE International Symposium on Information Theory (ISIT)
DOI
10.1109/ISIT45174.2021.9518177