PhD Students' Seminar:

December 4, 2024 at 17:00 - Room 109


Presented by Victor Thuot

Introduction to bandit theory



The multi-armed bandit problem is a powerful model for sequential decision-making under uncertainty, offering a practical framework for numerous real-world applications. From clinical trials to online recommendation systems, it captures scenarios where a learner must balance exploration and exploitation to optimize outcomes. In this tutorial-style presentation, I will introduce the multi-armed bandit problem, discuss its underlying motivations, and examine some of the best-known yet conceptually accessible algorithms from the literature.
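To make the exploration-exploitation trade-off concrete, here is a minimal Python sketch of one of the best-known algorithms from the bandit literature, UCB1 (Auer et al., 2002), run on a simulated Bernoulli bandit. It is an illustrative sketch only, not material from the talk; the arm means, horizon, and helper names are assumptions chosen for the example.

```python
import math
import random

def ucb1(arm_means, horizon, seed=0):
    """Run UCB1 on a simulated Bernoulli bandit.

    arm_means: true success probabilities of each arm (unknown to the learner,
               used here only to draw the simulated rewards).
    horizon:   number of rounds to play.
    Returns the total reward collected over the horizon.
    """
    rng = random.Random(seed)
    n_arms = len(arm_means)
    counts = [0] * n_arms   # number of times each arm has been pulled
    sums = [0.0] * n_arms   # cumulative reward observed on each arm

    total = 0.0
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1  # pull every arm once to initialise the estimates
        else:
            # empirical mean (exploitation) + confidence bonus (exploration)
            arm = max(
                range(n_arms),
                key=lambda a: sums[a] / counts[a]
                + math.sqrt(2.0 * math.log(t) / counts[a]),
            )
        reward = 1.0 if rng.random() < arm_means[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
        total += reward
    return total

if __name__ == "__main__":
    # Toy example: three arms; the learner should concentrate on the best one.
    print(ucb1([0.2, 0.5, 0.8], horizon=10_000))
```

The exploration bonus shrinks as an arm is pulled more often, so arms that look suboptimal are still sampled occasionally, which is exactly the balance the abstract describes.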


