Yongxian Wei, Xiu-Shen Wei. Task-specific Part Discovery for Fine-grained Few-shot Classification[J]. Machine Intelligence Research, 2024, 21(5): 954-965. DOI: 10.1007/s11633-023-1451-7

Task-specific Part Discovery for Fine-grained Few-shot Classification

Localizing discriminative object parts (e.g., the bird head) is crucial for fine-grained classification tasks, especially in the more challenging fine-grained few-shot scenario. Previous work always relies on learned object parts in a unified manner, attending to the same object parts (even with shared attention weights) across different few-shot episodic tasks. In this paper, we propose to adaptively capture the task-specific object parts that deserve attention for each few-shot task, since the parts that distinguish the classes of one task naturally differ from those of another. Specifically, for a few-shot task, after obtaining part-level deep features, we learn a task-specific part-based dictionary for both aligning and reweighting part features within an episode. Part-level categorical prototypes are then generated from the part features of the support data and later used to classify query data by distance computation. To retain the discriminative ability of the part-level representations (i.e., part features and part prototypes), we design an optimal transport solution that also exploits query data in a transductive way to optimize the aforementioned distance computation for the final predictions. Extensive experiments on five fine-grained benchmarks show the superiority of our method, especially in the 1-shot setting, with gains of 0.12%, 8.56% and 5.87% over state-of-the-art methods on CUB, Stanford Dogs, and Stanford Cars, respectively.
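As a minimal illustration of the prototype-and-distance pipeline described in the abstract, the sketch below (assuming NumPy, toy feature shapes, uniform class marginals, and a generic Sinkhorn routine rather than the authors' exact formulation) builds part-level prototypes from support features, scores queries by part-averaged distances, and refines the query-to-class assignment with an entropy-regularized optimal transport step in a transductive fashion.

    # Illustrative sketch only; shapes, marginals and the Sinkhorn routine
    # are assumptions, not the paper's exact method.
    import numpy as np

    def part_prototypes(support_feats, support_labels, n_classes):
        # Average part-level support features per class.
        # support_feats: (N_support, P, D) with P parts and D-dim features.
        return np.stack([support_feats[support_labels == c].mean(axis=0)
                         for c in range(n_classes)])             # (C, P, D)

    def part_distances(query_feats, protos):
        # Squared Euclidean distance to each class prototype, averaged over parts.
        # query_feats: (N_query, P, D) -> cost matrix of shape (N_query, C).
        diff = query_feats[:, None] - protos[None]               # (N_query, C, P, D)
        return (diff ** 2).sum(-1).mean(-1)

    def sinkhorn(cost, n_iters=50, eps=0.1):
        # Entropy-regularized optimal transport with uniform marginals over
        # queries and classes; the resulting plan softly assigns queries to
        # classes, playing the role of a transductive refinement step here.
        cost = cost / cost.max()                                  # numerical stability
        K = np.exp(-cost / eps)
        u = np.full(cost.shape[0], 1.0 / cost.shape[0])           # query marginal
        v = np.full(cost.shape[1], 1.0 / cost.shape[1])           # class marginal
        a, b = u.copy(), v.copy()
        for _ in range(n_iters):
            a = u / (K @ b)
            b = v / (K.T @ a)
        return a[:, None] * K * b[None, :]                        # transport plan

    # Toy 5-way 1-shot episode: 8 parts, 64-dim part features, 3 queries per class.
    rng = np.random.default_rng(0)
    C, P, D = 5, 8, 64
    support = rng.normal(size=(C, P, D))                          # one shot per class
    labels = np.arange(C)
    query = np.repeat(support, 3, axis=0) + 0.1 * rng.normal(size=(3 * C, P, D))

    protos = part_prototypes(support, labels, C)
    plan = sinkhorn(part_distances(query, protos))
    print(plan.argmax(axis=1))                                    # -> [0 0 0 1 1 1 ... 4 4 4]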
