2019 IEEE Global Communications Conference (GLOBECOM)

Heterogeneous Coded Computation across Heterogeneous Workers


Abstract


The coded distributed computing framework enables large-scale machine learning (ML) models to be trained efficiently in a distributed manner while mitigating the straggler effect. In this work, we consider a multi-task assignment problem in a coded distributed computing system, where multiple masters, each with a different matrix multiplication task, assign computation tasks to workers with heterogeneous computing capabilities. Both dedicated and probabilistic worker assignment models are considered, with the objective of minimizing the average completion time of all tasks. For dedicated worker assignment, greedy algorithms are proposed and the corresponding optimal load allocation is derived using the Lagrange multiplier method. For probabilistic assignment, the successive convex approximation method is used to solve the resulting non-convex optimization problem. Simulation results show that the proposed algorithms reduce the completion time by 80% compared with an uncoded scheme and by 49% compared with an unbalanced coded scheme.
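To make the straggler-mitigation idea behind coded matrix multiplication concrete, the following is a minimal sketch of an MDS-style (n, k) code for a matrix-vector product: the master splits the matrix into k row blocks, sends n coded blocks to n workers, and can recover the full product from any k results, so up to n - k stragglers can be ignored. The function names and the Vandermonde-based code are illustrative assumptions, not the paper's exact construction or load-allocation scheme.

```python
import numpy as np

def encode(A, n, k):
    """Split A row-wise into k blocks and emit n coded blocks; any k of
    the n workers' partial products suffice to recover A @ x."""
    blocks = np.split(A, k)  # assumes the row count of A is divisible by k
    nodes = np.arange(1, n + 1)
    # Vandermonde coefficients: any k rows are invertible (distinct nodes),
    # which is what gives the code its MDS property.
    V = np.vander(nodes, k, increasing=True).astype(float)
    coded = [sum(V[i, j] * blocks[j] for j in range(k)) for i in range(n)]
    return coded, V

def decode(partials, worker_ids, V):
    """Recover the uncoded block products from any k worker results."""
    k = V.shape[1]
    Vk = V[worker_ids[:k], :]                          # k x k, invertible
    Y = np.stack([partials[i] for i in worker_ids[:k]])
    Z = np.linalg.solve(Vk, Y)                         # undo the linear code
    return np.concatenate(Z)

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))
x = rng.standard_normal(4)

coded, V = encode(A, n=5, k=3)
partials = {i: c @ x for i, c in enumerate(coded)}     # each worker's job
survivors = [0, 2, 4]                                  # two stragglers dropped
y = decode(partials, survivors, V)
assert np.allclose(y, A @ x)
```

Here the master tolerates any two of the five workers straggling; the paper's contribution is then how to choose the load (block sizes) per worker and the assignment of heterogeneous workers to the masters' tasks so that the expected completion time is minimized.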

Pages 1-6
DOI 10.1109/GLOBECOM38437.2019.9014006
Language English
Journal 2019 IEEE Global Communications Conference (GLOBECOM)
