Prototype learning has achieved strong performance in many
fields, offering high flexibility and generalization. In this paper, we propose an efficient text line recognition method based on prototype learning, using feature-level sliding windows for classification. In this framework, we combine a weakly supervised discriminative loss and a generative loss to learn feature representations with intra-class compactness and inter-class separability. Dynamic weighting and pseudo-label filtering
are also adopted to reduce the influence of unreliable pseudo-labels and
improve training stability significantly. Furthermore, we introduce consistency regularization to obtain more reliable confidence distributions
and pseudo-labels. Experimental results on handwritten digit and Chinese handwritten text datasets demonstrate the superiority of our method and
confirm its advantages in transfer learning on small datasets.
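To illustrate the idea of pseudo-label filtering combined with dynamic weighting, the following is a minimal sketch, not the paper's implementation: it assumes a simple confidence-thresholding scheme, and the function name and `threshold` value are chosen for illustration only.

```python
import numpy as np

def filter_and_weight(probs, threshold=0.9):
    """Hypothetical sketch of confidence-based pseudo-label filtering
    with dynamic weighting; `threshold` is an assumed hyperparameter.

    probs: (N, C) array of per-sample class confidence distributions.
    Returns hard pseudo-labels, a keep mask, and per-sample loss weights.
    """
    confidence = probs.max(axis=1)        # peak confidence per sample
    pseudo_labels = probs.argmax(axis=1)  # hard pseudo-labels
    keep = confidence >= threshold        # filter unreliable pseudo-labels
    # Dynamic weighting: scale each kept sample's loss contribution by
    # its confidence, so borderline pseudo-labels influence training less.
    weights = np.where(keep, confidence, 0.0)
    return pseudo_labels, keep, weights
```

In this sketch, low-confidence samples receive zero weight and are effectively excluded, while kept samples are weighted by their confidence rather than counted equally.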