Neurocomputing | 2021
Self-paced active learning for deep CNNs via effective loss function
Abstract
Deep convolutional neural networks have achieved great success on computer vision tasks, but the scarcity of labeled samples hinders their real-world application. Active learning is an effective approach to exploiting large volumes of unlabeled data: it interactively selects a few samples according to a given criterion and queries their labels from annotators. To reduce annotation labor, we propose a novel framework, "self-paced multi-criteria active learning," for the image classification task. Unlike previous work, we treat each iteration of active learning as a step of self-paced learning, which holds that a model should proceed gradually from simple to complex samples during training. In the sample-selection stage, we combine clustering with a multi-criteria selection method to reduce the negative effect of hard samples. In the training phase, we design a similarity classification loss function to mitigate the impact of scarce labeled data. Experiments on multiple datasets demonstrate that the proposed method outperforms current approaches.
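To make the self-paced selection idea concrete, the following is a minimal sketch of one selection round, not the paper's actual algorithm: an uncertainty score stands in for sample difficulty, a growing threshold (`pace`) stands in for the self-paced schedule, and greedy farthest-point selection stands in for the clustering-based diversity step. All function and parameter names here are hypothetical.

```python
import numpy as np

def select_samples(scores, features, k, pace):
    """Pick up to k samples to query, preferring easy, diverse ones.

    scores   -- per-sample difficulty (e.g. model uncertainty), lower = easier
    features -- per-sample feature vectors used for diversity
    pace     -- current self-paced threshold; only samples with
                scores <= pace are considered (it grows each round)
    """
    # Self-paced filter: early rounds keep only the easiest samples.
    easy = np.where(scores <= pace)[0]
    if len(easy) == 0:
        easy = np.argsort(scores)[:k]

    # Diversity, a simple stand-in for clustering: start from the easiest
    # sample, then greedily add the sample farthest from those chosen so far.
    chosen = [easy[np.argmin(scores[easy])]]
    while len(chosen) < min(k, len(easy)):
        dists = np.min(
            np.linalg.norm(
                features[easy][:, None] - features[chosen][None], axis=-1
            ),
            axis=1,
        )
        chosen.append(easy[int(np.argmax(dists))])
    return chosen

# One hypothetical round: 5 samples, query 2, pace threshold 0.5.
scores = np.array([0.1, 0.9, 0.2, 0.8, 0.15])
features = np.arange(10, dtype=float).reshape(5, 2)
queried = select_samples(scores, features, k=2, pace=0.5)
```

Across rounds, the caller would raise `pace`, query the returned indices, retrain, and repeat, so the labeled pool grows from easy to hard samples.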