Seminar Session

Probability and Statistics Seminar

Monday, 23 May 2022, 13:45 - UM - Building 9 - Conference room (1st floor)

Camille Garcin (IMAG, INRIA, Université de Montpellier)

Stochastic smoothing of the top-K calibrated hinge loss for deep imbalanced classification

Modern classification tasks can involve several thousand classes, which may be highly similar to one another. One such example is the Pl@ntNet application, which aims to provide users with the correct plant species for an input image. In this context, high ambiguity results in low top-1 accuracy. This motivates top-K classification, in which K candidate classes are returned. Yet designing top-K losses (to minimize the top-K error) tailored for deep learning remains a challenge, both theoretically and practically. We will present a stochastic top-K hinge loss for deep learning, inspired by recent developments on top-K calibrated losses. The proposal is based on a smoothing of the top-K operator, building on the flexible "perturbed optimizer" framework. We show that our loss function performs well on balanced datasets. In addition, we propose a simple variant of our loss to handle imbalanced cases, which significantly outperforms other baseline loss functions on Pl@ntNet-300K. The latter is an open dataset of plant images, recently released by our team, obtained from the Pl@ntNet application and characterized by high ambiguity and a long-tailed class distribution.
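To give an intuition for the smoothing step, the sketch below shows one common way a perturbed top-K operator can be approximated by Monte Carlo: Gaussian noise is added to the class scores, a hard top-K indicator is computed for each noisy copy, and the indicators are averaged. This is only an illustrative PyTorch sketch, not the authors' implementation; the function name, the noise scale sigma, and the sample count are placeholders, and the dedicated gradient estimator used for the backward pass in the perturbed-optimizer framework is not shown.

import torch

def perturbed_topk_indicator(scores, k, sigma=0.5, n_samples=100):
    # Monte Carlo estimate of a smoothed top-K indicator (illustrative sketch).
    # scores: (batch, n_classes) tensor of class scores.
    # Returns a (batch, n_classes) tensor whose entries approximate, for each
    # class, the probability of lying in the top K after Gaussian perturbation.
    noise = torch.randn(n_samples, *scores.shape, device=scores.device)
    perturbed = scores.unsqueeze(0) + sigma * noise   # (n_samples, batch, n_classes)
    topk_idx = perturbed.topk(k, dim=-1).indices      # hard top-K per noisy sample
    indicator = torch.zeros_like(perturbed)
    indicator.scatter_(-1, topk_idx, 1.0)             # one-hot masks of the top-K sets
    return indicator.mean(dim=0)                      # averaging gives a smooth relaxation

# Usage: each row of the output sums to K and varies smoothly with the scores.
scores = torch.randn(4, 10)                               # 4 examples, 10 classes (dummy values)
print(perturbed_topk_indicator(scores, k=3).sum(dim=-1))  # tensor([3., 3., 3., 3.])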

This is joint work with Maximilien Servajean, Alexis Joly and Joseph Salmon.

Top-K loss paper: https://arxiv.org/pdf/2202.02193.pdf

Dataset paper: https://datasets-benchmarks-proceedings.neurips.cc/paper/2021/file/7e7757b1e12abcb736ab9a754ffb617a-Paper-round2.pdf

The seminar takes place at 13:45 in room 109 (IMAG, Building 9).
It will also be streamed on Zoom: https://umontpellier-fr.zoom.us/j/94087408185