Probability and Statistics Seminar:

September 12, 2022 at 1:45 PM - UM - Building 09 - Conference room (1st floor)


Presented by Rémi Gribonval - INRIA - ENS Lyon

Rapture of the deep: highs and lows of sparsity in a world of depths



Promoting sparse connections in neural networks is a natural way to control their computational complexity. Moreover, given its thoroughly documented role in inverse problems and variable selection, sparsity also has the potential to give rise to learning mechanisms endowed with certain interpretability guarantees. Through an overview of recent explorations around this theme, I will compare and contrast classical sparse regularization for inverse problems with multilayer sparse regularization. Along the way I will highlight the potential of an invariant path-embedding of the parameters of a deep network, both to learn the parameters and to analyze their identifiability from the function implemented by the network. In the process, we will be reminded that there is life beyond gradient descent, as illustrated by an algorithm that brings speedups of up to two orders of magnitude when learning certain fast transforms via multilayer sparse factorization.
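To give a flavor of what "fast transforms via multilayer sparse factorization" refers to, here is a minimal sketch (not the speaker's algorithm) of the classical observation behind it: an n x n Hadamard matrix, which is dense with n² nonzeros, factorizes exactly into log₂(n) butterfly factors with only 2 nonzeros per row, so applying it costs O(n log n) instead of O(n²). Multilayer sparse factorization methods aim to *learn* such factors from the dense matrix alone.

```python
import numpy as np

def butterfly_factors(L):
    """Return the log2(n) sparse butterfly factors of the 2^L x 2^L Hadamard matrix."""
    H2 = np.array([[1.0, 1.0], [1.0, -1.0]])
    # Each factor I_{2^l} (x) H2 (x) I_{2^{L-l-1}} has exactly 2 nonzeros per row.
    return [np.kron(np.kron(np.eye(2**l), H2), np.eye(2**(L - l - 1)))
            for l in range(L)]

L = 3
n = 2**L  # n = 8

factors = butterfly_factors(L)
H = np.linalg.multi_dot(factors)  # product of the L sparse factors

# Reference Hadamard matrix built by repeated Kronecker products.
H_ref = np.array([[1.0]])
for _ in range(L):
    H_ref = np.kron(H_ref, np.array([[1.0, 1.0], [1.0, -1.0]]))

assert np.allclose(H, H_ref)                                # exact factorization
assert all(np.count_nonzero(F) == 2 * n for F in factors)   # 2n nonzeros per factor
```

The dense matrix stores n² = 64 entries, while the three factors together store only 2n·log₂(n) = 48 nonzeros; the gap widens rapidly with n, which is what makes recovering such factorizations from data attractive.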

Seminar in room 109; no Zoom broadcast.
