IEEE Communications Letters | 2021

Distributed Model Training Based on Data Parallelism in Edge Computing-Enabled Elastic Optical Networks


Abstract


The emergence of edge computing provides an effective way to execute distributed model training (DMT). How training data are deployed among edge nodes affects both the training efficiency and the network resource usage. This letter targets the efficient provisioning of DMT services by optimizing the partition and distribution of training data in edge computing-enabled elastic optical networks. An integer linear programming (ILP) model and a data parallelism deployment algorithm (DPDA) are proposed to solve this problem. The performance of the proposed approaches is evaluated through simulation. Simulation results show that the proposed algorithm can deploy more DMT services than the benchmark.
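For intuition, the sketch below casts the core decision, how to partition a training dataset across edge nodes, as a small ILP in Python with PuLP. The node set, capacities, and per-unit training times are hypothetical placeholders, and the objective (minimize the slowest worker's finish time) is only a stand-in for training efficiency; this is not the letter's actual formulation.

```python
# Minimal sketch: partition D data units across edge nodes as an ILP.
# All numbers below are hypothetical, not from the letter.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpStatus, value

nodes = ["e1", "e2", "e3"]                    # candidate edge nodes (hypothetical)
D = 100                                       # training-data units to partition
cap = {"e1": 60, "e2": 40, "e3": 50}          # per-node data capacity (hypothetical)
t_unit = {"e1": 1.0, "e2": 1.5, "e3": 1.2}    # training time per data unit (hypothetical)

prob = LpProblem("dmt_data_partition", LpMinimize)
x = {n: LpVariable(f"x_{n}", lowBound=0, upBound=cap[n], cat="Integer")
     for n in nodes}                          # data units assigned to node n
T = LpVariable("makespan", lowBound=0)        # completion time of the slowest node

prob += T                                     # objective: minimize the makespan
prob += lpSum(x[n] for n in nodes) == D       # all training data must be placed
for n in nodes:
    prob += x[n] * t_unit[n] <= T             # each node finishes within T

prob.solve()
print(LpStatus[prob.status],
      {n: int(value(x[n])) for n in nodes}, value(T))
```

In data-parallel training the iteration time is set by the slowest worker, so a makespan objective is a natural (if simplified) proxy; the letter's full model additionally accounts for the distribution of data over the optical network, which this sketch omits.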

Volume 25
Pages 1241-1244
DOI 10.1109/LCOMM.2020.3041453
Language English
Journal IEEE Communications Letters

Full Text