  • 1
    Online resource
    In: International Journal of Pattern Recognition and Artificial Intelligence, World Scientific Pub Co Pte Ltd, Vol. 37, No. 01 (2023-01)
    Abstract: Compared to cross-entropy in deep learning, triplet loss is less affected by biased label information and is widely used in fine-grained visual tasks. In person re-identification (re-ID) in particular, triplet loss is improved with batch-hard sampling, which selects only the hardest samples during training to reduce the number of invalid triplets involved in the loss computation. Computing the loss on the hardest samples provides a more intense gradient descent than using raw samples. However, batch-hard triplet loss discards many samples that carry important information, which can negatively impact feature learning, and the hardest samples frequently cause the loss to get stuck during training. In this work, we propose a balanced triplet loss for comprehensive feature learning and stable model convergence. The balanced triplet loss mines only the hardest negative sample of each category within a mini-batch. Compared with batch-hard triplet loss, it preserves the features of all negative categories rather than only the one category containing the hardest negative sample, striking a balance between triplet selection and information loss. Experiments show that our method produces competitive results on re-ID tasks. In addition, we analyze the correlation between the intensity of data mining and the granularity of feature learning, and further adapt the balanced triplet loss to general fine-grained image classification. Experiments show that the adapted balanced triplet loss also outperforms cross-entropy on multiple datasets of different scales.
    Material type: Online resource
    ISSN: 0218-0014, 1793-6381
    Language: English
    Publisher: World Scientific Pub Co Pte Ltd
    Publication date: 2023
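The abstract sketches the mining rule: for each anchor, keep the hardest positive but, instead of the single hardest negative in the whole batch, take the hardest (closest) negative from each other identity and average the resulting hinge terms. A minimal pure-Python sketch of that rule, assuming Euclidean distances and per-anchor averaging (the function name and these details are not specified in the abstract and are assumptions):

```python
import math

def balanced_triplet_loss(embeddings, labels, margin=0.3):
    """Sketch of a balanced triplet loss as described in the abstract
    (details assumed): per anchor, use the hardest (farthest) positive
    and the hardest (closest) negative of EACH other category, then
    average the hinge terms max(0, d_ap - d_an + margin)."""
    n = len(embeddings)

    def dist(a, b):
        # Euclidean distance between two embedding vectors
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    anchor_losses = []
    for i in range(n):
        positives = [dist(embeddings[i], embeddings[j])
                     for j in range(n) if j != i and labels[j] == labels[i]]
        if not positives:
            continue  # anchor has no positive pair in this mini-batch
        d_ap = max(positives)  # hardest positive
        terms = []
        for c in sorted(set(l for l in labels if l != labels[i])):
            # hardest (closest) negative within category c
            d_an = min(dist(embeddings[i], embeddings[j])
                       for j in range(n) if labels[j] == c)
            terms.append(max(0.0, d_ap - d_an + margin))
        anchor_losses.append(sum(terms) / len(terms))
    return sum(anchor_losses) / len(anchor_losses)
```

With well-separated classes every hinge term is inactive and the loss is zero; with fully collapsed embeddings each term reduces to the margin. Unlike batch-hard mining, every negative category contributes one triplet per anchor, which is the balance between triplet selection and information loss the abstract refers to.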