2021 IEEE 17th International Colloquium on Signal Processing & Its Applications (CSPA)

Optimized Machine Learning Algorithm using Hybrid Proximal Method with Spectral Gradient Techniques

Abstract


Deep learning models are widely implemented in various machines to perform complicated tasks. Therefore, a significant amount of research effort focuses on improving the implementation of such models. One of the key bottlenecks is the lengthy and inefficient model training process, and a large body of literature has been published on improving it. In this paper, the Spectral Proximal (SP) optimization method is studied and presented. The SP method is an optimization algorithm that combines the Multiple Damping Gradient (MDG) method with a sparsity optimizer to improve training efficiency in machine learning. The MDG algorithm utilizes a damping matrix to correct errors in the descent direction. In addition, the sparsity optimizer eliminates insignificant elements in the solution to reduce unnecessary computation. We conducted a training experiment to evaluate the SP method against the Adam method. In the experiment, both methods were used to train the You Only Look Once version 3 (YOLOv3) model on an object detection dataset known as the YYMNIST dataset, which is composed of selected images from the Modified National Institute of Standards and Technology (MNIST) dataset. In the experiment, the SP method displayed a faster convergence rate and achieved a slightly higher mean Average Precision (mAP) than the Adam method. However, the SP method required slightly longer training time due to its higher computational cost.
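To make the two ingredients described above concrete, the sketch below shows a toy update rule that (a) damps the raw gradient direction and (b) zeroes out parameter elements whose magnitude falls below a threshold. This is only an illustrative stand-in under stated assumptions: the function name `sp_step`, the scalar `damping` factor (the paper uses a full MDG damping matrix), the hard-threshold `sparsity_tol`, and the quadratic test objective are all hypothetical choices for demonstration, not the authors' actual algorithm.

```python
import numpy as np

def sp_step(w, grad, lr=0.1, damping=1e-2, sparsity_tol=1e-3):
    """One illustrative optimizer step (assumed form, not the paper's).

    - `damping` scales the descent direction, a scalar stand-in for
      the Multiple Damping Gradient (MDG) damping matrix.
    - `sparsity_tol` zeroes insignificant elements, a stand-in for
      the sparsity optimizer that prunes unnecessary computation.
    """
    step = grad / (1.0 + damping)              # damped descent direction
    w_new = w - lr * step                      # plain gradient step
    w_new[np.abs(w_new) < sparsity_tol] = 0.0  # prune tiny elements
    return w_new

# Minimize the toy objective f(w) = 0.5 * ||w||^2, whose gradient is w.
w = np.array([1.0, 0.0005, -2.0])
for _ in range(100):
    w = sp_step(w, grad=w)

# All elements shrink geometrically and are eventually pruned to exactly 0,
# illustrating how small entries are eliminated rather than left as noise.
print(w)
```

With a proximal-style threshold like this, elements that would otherwise decay asymptotically toward zero are set to exactly zero once they become insignificant, which is how the sparsity component can skip computation on those entries.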

Pages 101-106
DOI 10.1109/CSPA52141.2021.9377294
Language English
Journal 2021 IEEE 17th International Colloquium on Signal Processing & Its Applications (CSPA)
