Probability and Statistics Seminar:

March 4, 2024 at 1:45 PM - UM - Building 09 - Conference room (1st floor)


Presented by Sébastien Herbreteau - EPFL

Towards better conditioned and interpretable neural networks: a study of the normalization-equivariance property with application to image denoising



In many information processing systems, it may be desirable to ensure that any change in the input, whether a shift or a scaling, results in a corresponding change in the system response. While deep neural networks are gradually replacing all traditional automatic processing methods, they surprisingly do not guarantee such a normalization-equivariance (scale & shift) property, which can be detrimental in many applications. Inspired by traditional methods in image denoising, we propose a methodology to adapt existing convolutional neural networks so that normalization-equivariance holds by design and without performance loss. Our main claim is that not only ordinary unconstrained convolutional layers but also all activation functions, including ReLU (the rectified linear unit), which are applied element-wise to the pre-activated neurons, should be completely removed from neural networks and replaced by better-conditioned alternatives. As a result, we show that better conditioning improves not only the interpretability but also the robustness of these networks to outliers, which is experimentally confirmed in the context of image denoising.
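To make the property concrete: a map f is normalization-equivariant when f(ax + b) = a f(x) + b for any scale a > 0 and shift b. The following NumPy sketch is for illustration only and is not necessarily the construction presented in the talk: it assumes a convolution whose kernel weights are renormalized to sum to one and that carries no bias, which satisfies the property by linearity, whereas an element-wise ReLU does not.

import numpy as np

def sum_to_one_conv1d(x, w):
    """1-D convolution with kernel renormalized to sum to one and no bias.
    Illustrative assumption: such a layer is normalization-equivariant,
    since conv(a*x + b) = a*conv(x) + b when the weights sum to one."""
    w = w / w.sum()
    return np.convolve(x, w, mode="valid")

rng = np.random.default_rng(0)
x = rng.normal(size=32)                     # toy 1-D signal
w = np.array([1.0, 2.0, 3.0, 2.0, 1.0])    # kernel, normalized inside the layer
a, b = 2.5, -0.7                            # arbitrary scale and shift

lhs = sum_to_one_conv1d(a * x + b, w)
rhs = a * sum_to_one_conv1d(x, w) + b
print(np.allclose(lhs, rhs))                # True: equivariance holds by construction

relu = lambda t: np.maximum(t, 0.0)
print(np.allclose(relu(a * x + b), a * relu(x) + b))  # False: ReLU breaks the shift part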

The seminar takes place in room 109 and is also streamed on Zoom: https://umontpellier-fr.zoom.us/j/94087408185


