2021 International Joint Conference on Neural Networks (IJCNN)
Density-Fixing: Simple yet Effective Regularization Method based on the Class Priors
Abstract
Machine learning models often overfit when labeled data are scarce. To tackle this problem, we propose density-fixing, a framework of regularization methods applicable to both supervised and semi-supervised learning. The proposed regularization improves generalization performance by forcing the model to approximate the class prior distribution, i.e., the occurrence frequency of each class. The regularization term is derived naturally from the maximum likelihood estimation formulation and is theoretically justified. We provide several theoretical analyses of the proposed method, including its asymptotic behavior and how the regularization term behaves in practice under various assumed class prior distributions. Experimental results on multiple benchmark datasets support our claims, and we suggest that this simple yet effective regularization method is useful in real-world machine learning problems.
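The abstract does not give the exact form of the regularizer, but the idea of "forcing the model to approximate the class prior distribution" can be sketched as a penalty on the gap between the model's batch-averaged predictions and the known class priors. Below is a minimal, hypothetical illustration using a KL-divergence penalty; the function name, the choice of KL divergence, and the batch-averaging step are all assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def density_fixing_penalty(probs, prior, eps=1e-12):
    """Hypothetical sketch of a density-fixing-style regularizer.

    Computes the KL divergence between a given class prior and the
    batch-averaged predicted class distribution, so minimizing this
    penalty pulls the model's marginal predictions toward the priors.

    probs: (N, C) array of per-example predicted class probabilities.
    prior: (C,) array of class prior probabilities (sums to 1).
    eps:   small constant to avoid log(0).
    """
    marginal = probs.mean(axis=0)  # model's marginal class distribution
    return float(np.sum(prior * np.log((prior + eps) / (marginal + eps))))

# Usage: the penalty is ~0 when the batch-averaged predictions
# already match the class prior.
probs = np.array([[0.7, 0.3],
                  [0.5, 0.5],
                  [0.6, 0.4]])
prior = np.array([0.6, 0.4])
penalty = density_fixing_penalty(probs, prior)
```

In practice such a term would be added, with a weighting coefficient, to the usual supervised loss; on unlabeled data it can still be evaluated, which is consistent with the abstract's claim that the framework applies to both supervised and semi-supervised learning.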