Probability and Statistics Seminar:
April 1, 2019 at 13:45 - UM - Building 09 - Conference room (1st floor)
Presented by Clémentine Prieur - Université Grenoble Alpes
Dimension reduction of the input parameter space for potentially vector-valued functions
Many problems that arise in uncertainty quantification, e.g., integrating or approximating multivariate functions, suffer from the curse of dimensionality: the cost of computing a sufficiently accurate approximation grows dramatically with the dimension of the input parameter space. It thus seems important to identify and exploit some notion of low-dimensional structure, e.g., the intrinsic dimension of the model. A function varying primarily along a few directions of the input parameter space is said to be of low intrinsic dimension. In that setting, algorithms for quantifying uncertainty that focus on these important directions are expected to reduce the overall cost. A common approach to reducing a function's input dimension is the truncated Karhunen–Loève decomposition [1], which exploits the correlation structure of the function's input space. In the present talk, we propose to exploit not only input correlations but also the structure of the input-output map itself. We will first focus the presentation on approaches based on global sensitivity analysis. The main drawback of global sensitivity analysis is the cost required to estimate sensitivity indices such as Sobol' indices [2]. This is the main reason why we turn to the notion of active subspaces [3, 4], defined as eigenspaces of the average outer product of the function's gradient with itself. They capture the directions along which the function varies the most, in the sense of its output responding most strongly to input perturbations, in expectation over the input measure (a schematic numerical sketch is given after the references). In particular, we will present recent results stated in [5] dealing with the framework of multivariate vector-valued functions.

References

[1] C. Schwab, R. A. Todor, Karhunen–Loève approximation of random fields by generalized fast multipole methods, Journal of Computational Physics 217 (1) (2006) 100–122.
[2] I. Sobol, Sensitivity estimates for nonlinear mathematical models, Mathematical Modelling and Computational Experiments 1 (1993) 407–414.
[3] P. G. Constantine, E. Dow, Q. Wang, Active subspace methods in theory and practice: Applications to kriging surfaces, SIAM Journal on Scientific Computing 36 (4) (2014) A1500–A1524.
[4] P. G. Constantine, Active Subspaces: Emerging Ideas for Dimension Reduction in Parameter Studies, Society for Industrial and Applied Mathematics, Philadelphia, 2015.
[5] O. Zahm, P. Constantine, C. Prieur, Y. Marzouk, Gradient-based dimension reduction of multivariate vector-valued functions (2018). URL https://hal.inria.fr/hal-01701425
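As a minimal illustrative sketch of the active-subspace construction mentioned in the abstract (a scalar toy case, not the vector-valued extension presented in [5]), the Python snippet below estimates the average outer product of the gradient by Monte Carlo and extracts its leading eigenvector; the toy function, the standard Gaussian input measure, and the sample size are assumptions made purely for illustration.

    import numpy as np

    # Toy scalar function f(x) = sin(w . x); its gradient is cos(w . x) * w,
    # so f varies only along the single direction w and the active subspace
    # is span{w}. The direction w and the Gaussian input measure are
    # illustrative assumptions.
    rng = np.random.default_rng(0)
    d = 10                                  # input dimension
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)                  # the (hidden) important direction

    def grad_f(x):
        return np.cos(x @ w) * w            # gradient of f(x) = sin(w . x)

    # Monte Carlo estimate of C = E[grad f(X) grad f(X)^T], the average outer
    # product of the gradient with itself over the input measure.
    n_samples = 2000
    X = rng.standard_normal((n_samples, d))
    G = np.array([grad_f(x) for x in X])    # n_samples x d matrix of gradients
    C = G.T @ G / n_samples

    # Eigendecomposition of C; eigenvectors associated with the largest
    # eigenvalues span the (estimated) active subspace, i.e. the directions
    # along which f varies the most in expectation.
    eigvals, eigvecs = np.linalg.eigh(C)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    k = 1                                   # retained dimension (known for this toy case)
    active_dirs = eigvecs[:, :k]
    print("leading eigenvalues:", np.round(eigvals[:3], 4))
    print("alignment with w:", abs(active_dirs[:, 0] @ w))  # should be close to 1

In this toy case a single eigenvalue dominates and the recovered direction aligns with w, which is the low intrinsic dimension situation the talk targets; in practice the spectrum of C is used to decide how many directions to retain.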