
Performance analysis of federated machine learning frameworks

  • Qinghe JING

Student thesis: Master's thesis

Abstract

The scarcity of data and the prevalence of isolated data islands encourage organizations to share data with each other to train machine learning models. However, growing concerns about data privacy and security urge people to seek solutions such as Federated Learning (FL), which allows parties to train on shared data without violating data privacy. Google first introduced an FL solution for mobile devices, in which users form a federation and cooperatively train a powerful model without leaking their own data. WeBank developed Federated Transfer Learning (FTL), which extends FL to a wider range of scenarios. However, these benefits come at the cost of extra computation and communication overhead, resulting in efficiency problems. In order to efficiently deploy and scale up Federated Learning solutions in production environments, we need a deep understanding of how the infrastructure affects efficiency. This thesis tries to answer this question by quantitatively measuring real-world Federated Learning applications (TFF and FATE) on Google Cloud. Based on the results of carefully designed experiments, we present the bottlenecks of each application, which can guide future optimizations.
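The federated training loop the abstract describes (clients train locally on private data, and only model updates are sent to a server for aggregation) can be sketched as FedAvg-style weight averaging. All names below are illustrative; this is a minimal conceptual sketch, not the TFF or FATE API:

```python
# Sketch of the Federated Averaging (FedAvg) idea: each client performs
# local training and shares only model weights, never raw data; the
# server aggregates the weights into a new global model.

def local_update(weights, gradient, lr=0.1):
    """One hypothetical gradient step on a client's private data."""
    return [w - lr * g for w, g in zip(weights, gradient)]

def federated_average(client_weights, client_sizes):
    """Server-side aggregation: average client models weighted by data size."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [
        sum(w[i] * s for w, s in zip(client_weights, client_sizes)) / total
        for i in range(n_params)
    ]

# Two clients start from the same global model and train locally.
global_model = [0.0, 0.0]
client_a = local_update(global_model, gradient=[1.0, -1.0])   # holds 100 samples
client_b = local_update(global_model, gradient=[-1.0, 1.0])   # holds 300 samples

new_global = federated_average([client_a, client_b], [100, 300])
print(new_global)  # -> [0.05, -0.05]
```

The extra communication round (every client uploading weights, the server broadcasting the new global model) is exactly the overhead whose cost this thesis measures.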
Date of Award: 2019
Original language: English
Awarding Institution:
  • The Hong Kong University of Science and Technology
