Probability and Statistics Seminar:
13 May 2024 at 13:45 - UM - Building 09 - Conference room (1st floor)
Presented by Julyan Arbel - Inria Grenoble - Rhône-Alpes
Rapture of the deep: highs and lows of Bayes in a world of depths
Bayesian deep learning is appealing because it combines the coherence and natural uncertainty quantification of the Bayesian paradigm with the expressivity and compositional flexibility of deep neural networks. Moreover, it has the potential to provide learning mechanisms endowed with certain interpretability guarantees. In this talk, I will give an overview of distributional properties of Bayesian neural networks. The journey starts with early work by Radford Neal in the 1990s, which led to the so-called Gaussian hypothesis on the pre-activations, justified when the number of neurons per layer tends to infinity. I will then contrast this hypothesis with recent work on heavy-tailed pre-activations. Finally, I will describe a set of constraints that a neural network should fulfill to ensure Gaussian pre-activations.
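
To make the Gaussian hypothesis concrete, here is a minimal illustrative sketch in Python/NumPy (not material from the talk; the network, its parameters, and the function names are my own choices): by a central-limit argument in the spirit of Neal's work, the pre-activation of an output unit of a random two-layer network approaches a Gaussian as the hidden width grows, while small widths show heavier tails.

    import numpy as np

    rng = np.random.default_rng(0)

    def output_preactivations(width, n_draws=10_000, d_in=5):
        # Pre-activation of one output unit of a random two-layer tanh network,
        # sampled over independent weight draws for a single fixed input x.
        x = rng.standard_normal(d_in)
        # First-layer pre-activations W1 @ x are exactly i.i.d. N(0, ||x||^2),
        # so sample them directly instead of materialising W1.
        h = np.tanh(np.linalg.norm(x) * rng.standard_normal((n_draws, width)))
        # Second-layer weights N(0, 1/width) keep the output variance O(1).
        w2 = rng.standard_normal((n_draws, width)) / np.sqrt(width)
        return (w2 * h).sum(axis=1)

    for width in (2, 10, 1000):
        z = output_preactivations(width)
        # Excess kurtosis is 0 for a Gaussian; it should shrink as width grows.
        kurt = np.mean((z - z.mean()) ** 4) / z.var() ** 2 - 3.0
        print(f"width={width:5d}  var={z.var():.3f}  excess kurtosis={kurt:+.3f}")

The excess kurtosis should come out clearly positive at small widths and near zero at large widths, in line with the infinite-width Gaussian limit.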
Reference:
This review provides useful pointers to the literature:
A Primer on Bayesian Neural Networks: Review and Debates https://arxiv.org/abs/2309.16314
Seminar in room 109, also streamed on Zoom:
https://umontpellier-fr.zoom.us/j/94087408185