Journal of Lightwave Technology | 2021

Multivariate Machine Learning Models for Short-Term Forecast of Lightpath Performance

Abstract


Machine Learning (ML) is emerging as a promising solution for managing the physical layer of heterogeneous, dynamic optical networks transporting applications in a software-defined networking (SDN) context, in particular for performance prediction. We propose two multivariate neural network models, based on Gated Recurrent Unit (GRU) and Long Short-Term Memory (LSTM) methods and trained with field performance data and features, for predicting lightpath signal-to-noise ratio (SNR) over forecast horizons of up to 4 days. The best performance is achieved by a 5-feature multivariate LSTM model over forecast horizons of up to 96 hours, with an absolute maximum error (AME) of 0.90 dB, compared to 0.91 dB and 0.97 dB for the univariate GRU and LSTM models, respectively, and 1.21 dB for a persistence model. The 2-feature multivariate models obtained through feature engineering perform better than their univariate counterparts for forecast horizons of up to 40 hours. Lastly, we explore the concept of transfer learning (TL) by testing the trained multivariate LSTM and univariate GRU models on field data from two lightpaths carried on the same route. The TL models underperform the naive model for the lightpath carried in a different optical fiber. However, for the lightpath carried in the same optical fiber over a portion of the same route, the LSTM-based TL model outperforms the naive model by up to 0.11 dB at a 96-hour forecast horizon, compared to 0.30 dB for the lightpath in the source domain, while using one third as much training data.
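As a rough illustration of the forecasting setup described above, the sketch below builds a multivariate LSTM that maps a window of monitored features to a 96-hour SNR forecast and compares it against a persistence (naive) baseline using the absolute maximum error (AME). This is not the authors' implementation: the lookback window, hidden size, training schedule, and synthetic stand-in data are all illustrative assumptions; only the 5-feature input, the 96-hour horizon, the AME metric, and the persistence baseline come from the abstract.

```python
# Minimal sketch (assumptions noted above) of a multivariate LSTM SNR forecaster
# with a persistence baseline and the absolute maximum error (AME) metric.
import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)
np.random.seed(0)

N_FEATURES = 5   # SNR plus four auxiliary monitored features (assumed set)
LOOKBACK = 48    # hours of history fed to the model (assumed)
HORIZON = 96     # forecast horizon in hours, as in the abstract


class SNRForecaster(nn.Module):
    """Multivariate LSTM mapping a LOOKBACK-hour window of N_FEATURES
    monitored quantities to an HORIZON-hour SNR forecast."""

    def __init__(self, n_features: int, hidden: int = 64, horizon: int = HORIZON):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.lstm(x)           # (batch, LOOKBACK, hidden)
        return self.head(out[:, -1])    # forecast from the last hidden state


def absolute_max_error(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """AME: worst-case absolute deviation over the forecast horizon."""
    return float(np.max(np.abs(y_true - y_pred)))


# Synthetic stand-in data; the paper uses field performance data.
T = 2000
t = np.arange(T)
snr = 20 + 0.5 * np.sin(2 * np.pi * t / 24) + 0.1 * np.random.randn(T)
aux = 0.1 * np.random.randn(T, N_FEATURES - 1)
series = np.column_stack([snr, aux]).astype(np.float32)


def make_windows(data, lookback, horizon):
    X, Y = [], []
    for i in range(len(data) - lookback - horizon):
        X.append(data[i:i + lookback])
        Y.append(data[i + lookback:i + lookback + horizon, 0])  # SNR only
    return torch.tensor(np.array(X)), torch.tensor(np.array(Y))


X, Y = make_windows(series, LOOKBACK, HORIZON)
split = int(0.8 * len(X))
model = SNRForecaster(N_FEATURES)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):                  # short full-batch run for illustration
    opt.zero_grad()
    loss = loss_fn(model(X[:split]), Y[:split])
    loss.backward()
    opt.step()

with torch.no_grad():
    pred = model(X[split:]).numpy()

# Persistence (naive) baseline: repeat the last observed SNR over the horizon.
persist = np.repeat(X[split:, -1, 0:1].numpy(), HORIZON, axis=1)
true = Y[split:].numpy()
print("LSTM AME:        %.2f dB" % absolute_max_error(true, pred))
print("Persistence AME: %.2f dB" % absolute_max_error(true, persist))
```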

DOI 10.1109/jlt.2021.3110513
Language English
Journal Journal of Lightwave Technology
