Coded Distributed Computing for Hierarchical Multi-task Learning

Haoyang Hu*, Songze Li, Minquan Cheng, Youlong Wu*

*Corresponding author for this work

Research output: Chapter in Book/Conference Proceeding/Report › Conference paper published in a book › peer-review

Abstract

In this paper, we consider a hierarchical distributed multi-task learning (MTL) system in which distributed users wish to jointly learn different models, orchestrated by a central server with the help of a layer of multiple relays. Since the users need to download different learning models in the downlink transmission, distributed MTL suffers more severely from the communication bottleneck than a single-task learning system. To address this issue, we propose a coded hierarchical MTL scheme that exploits the connection topology and introduces coding techniques to reduce the communication loads. It is shown that the proposed scheme can significantly reduce the communication loads in both the uplink and downlink transmissions between the relays and the server. Moreover, we provide information-theoretic lower bounds on the optimal uplink and downlink communication loads, and prove that the gaps between the achievable upper bounds and the lower bounds are within the minimum number of connected users among all relays.
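The abstract's central idea, that coding can cut the downlink load when users want different models, can be illustrated with a toy sketch. This is not the paper's actual scheme; it only shows the standard coded-multicast trick the abstract alludes to: one coded transmission serves two users who each hold the other's desired message as side information. All payloads and variable names below are hypothetical.

```python
# Toy coded-multicast sketch (illustrative only, not the paper's scheme):
# the server XORs two model updates so a single downlink transmission
# serves both users, each of which cancels its side information.

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Hypothetical model updates destined for user 1 and user 2.
update_for_u1 = b"\x01\x02\x03\x04"
update_for_u2 = b"\x0a\x0b\x0c\x0d"

# Suppose each user already holds the other's update as side
# information (e.g., acquired during the uplink computation phase).
side_info_u1 = update_for_u2
side_info_u2 = update_for_u1

# Uncoded downlink would need two transmissions; coded needs one.
coded = xor_bytes(update_for_u1, update_for_u2)

# Each user XORs out its side information to recover its own update.
decoded_u1 = xor_bytes(coded, side_info_u1)
decoded_u2 = xor_bytes(coded, side_info_u2)

assert decoded_u1 == update_for_u1
assert decoded_u2 == update_for_u2
```

In this two-user case the downlink load is halved; the paper's scheme generalizes this kind of coding gain to a hierarchy of relays and many users with differing tasks.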

Original language: English
Title of host publication: 2023 IEEE Information Theory Workshop, ITW 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 480-485
Number of pages: 6
ISBN (Electronic): 9798350301496
DOIs
Publication status: Published - 2023
Externally published: Yes
Event: 2023 IEEE Information Theory Workshop, ITW 2023 - Saint-Malo, France
Duration: 23 Apr 2023 - 28 Apr 2023

Publication series

Name: 2023 IEEE Information Theory Workshop, ITW 2023

Conference

Conference: 2023 IEEE Information Theory Workshop, ITW 2023
Country/Territory: France
City: Saint-Malo
Period: 23/04/23 - 28/04/23

Bibliographical note

Publisher Copyright:
© 2023 IEEE.

Keywords

  • Multi-task learning
  • coded computing
  • communication load
  • distributed learning
  • hierarchical systems

