Archive | 2019

Gradient Descent Analysis: On Visualizing the Training of Deep Neural Networks

Abstract


We present an approach to visualizing gradient descent methods and discuss its application in the context of deep neural network (DNN) training. The result is a novel type of training error curve that (a) allows an exploration of each individual gradient descent iteration at the line-search level; (b) reflects how a DNN's training error varies along each of the descent directions considered; and (c) is consistent with the traditional training-error-versus-training-iteration view commonly used to monitor a DNN's training progress. We show how these three levels of detail can be realized naturally as the three stages of Shneiderman's Visual Information Seeking Mantra. This suggests the design and development of a new interactive visualization tool for the exploration of DNN learning processes. We present an example that showcases a conceivable interactive workflow when working with such a tool. Moreover, we give a first impression of a possible DNN hyperparameter analysis.
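The core idea of the fine-grained error curve can be sketched in a few lines: at each gradient descent iteration, sample the training error along the current descent direction, so that every iteration contributes a one-dimensional error profile rather than a single point. The sketch below is illustrative only and uses a toy quadratic loss in place of a DNN's training error; the function names (`loss`, `grad`, `line_profile`) and the grid-based line search are assumptions, not the paper's implementation.

```python
import numpy as np

def loss(w):
    # Toy quadratic loss standing in for a DNN's training error (assumption).
    return 0.5 * np.sum(w ** 2) + 0.25 * w[0] * w[1]

def grad(w):
    # Analytic gradient of the toy loss above.
    return np.array([w[0] + 0.25 * w[1], w[1] + 0.25 * w[0]])

def line_profile(w, d, ts):
    """Training error sampled along descent direction d, starting at w."""
    return np.array([loss(w + t * d) for t in ts])

w = np.array([2.0, -1.5])          # initial parameters
ts = np.linspace(0.0, 1.5, 25)     # step-size grid for the line search
profiles = []

for it in range(5):
    d = -grad(w)                   # steepest-descent direction
    prof = line_profile(w, d, ts)  # 1-D error curve for this iteration
    profiles.append(prof)
    step = ts[np.argmin(prof)]     # crude exact line search on the grid
    w = w + step * d

# profiles[i] is the per-iteration error curve; concatenating these curves
# gives a training error plot resolved at the line-search level, while the
# endpoint of each curve recovers the usual error-versus-iteration view.
```

Each `profiles[i]` starts at the current iterate's error (`t = 0`), so stitching the profiles together stays consistent with the conventional per-iteration error curve, which is the consistency property (c) described above.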

Pages 338-345
DOI 10.5220/0007583403380345
Language English
