03. High-dimensional metamodels and the curse of dimensionality in uncertainty quantification: a challenge to be solved

State-of-the-art computer codes for simulating real physical systems (neuroscience, microwave systems, power systems, cyber-physical systems, smart grids, etc.) are often characterized by a vast number of stochastic input parameters. Performing uncertainty quantification (UQ) tasks with Monte Carlo (MC) methods is almost always infeasible, because hundreds of thousands or even millions of forward model evaluations are needed to obtain convergent statistics. One therefore tries to construct a cheap-to-evaluate metamodel (surrogate model) to replace the forward model solver. For systems with a large number of input parameters, one has to deal with the curse of dimensionality: the volume of the input space grows exponentially while the number of input parameters grows only linearly.
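To make the surrogate idea concrete, the following is a minimal sketch of surrogate-based Monte Carlo in Python. The forward model expensive_model is a hypothetical stand-in for a costly solver, and the Gaussian-process surrogate (scikit-learn) is only one possible metamodel choice; the essential point is that the large MC campaign runs on the surrogate, not on the solver.

    # Minimal sketch of surrogate-based Monte Carlo UQ. `expensive_model` is a
    # hypothetical placeholder for a costly forward solver; the Gaussian-process
    # surrogate stands in for whatever metamodel is ultimately selected.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    rng = np.random.default_rng(0)
    dim = 8  # number of stochastic input parameters (illustrative)

    def expensive_model(x):
        # Placeholder for a costly forward solver; analytic here for demonstration.
        return np.sum(np.sin(x)) + 0.1 * np.prod(np.cos(x))

    # 1) A small design of experiments: only a few hundred forward solves.
    X_train = rng.uniform(-1.0, 1.0, size=(200, dim))
    y_train = np.array([expensive_model(x) for x in X_train])

    # 2) Fit the cheap-to-evaluate surrogate.
    surrogate = GaussianProcessRegressor().fit(X_train, y_train)

    # 3) Run the large Monte Carlo campaign on the surrogate, not the solver.
    X_mc = rng.uniform(-1.0, 1.0, size=(100_000, dim))
    y_mc = surrogate.predict(X_mc)
    print(f"mean = {y_mc.mean():.4f}, std = {y_mc.std():.4f}")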


This project aims to test and develop techniques for building high-dimensional models for UQ. A combination of dimensionality reduction and advanced modeling techniques, such as high-dimensional model representation (HDMR) [1], deep neural networks [2], sparse grids, and local on-demand models, will be investigated and compared. Benchmarks of increasing complexity will be studied.
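As one concrete illustration of a candidate technique, the following is a minimal sketch of a first-order cut-HDMR surrogate in the spirit of [1]. The forward model expensive_model (the same hypothetical stand-in as above), the anchor point, and the grid resolution are assumptions made for illustration; higher-order interaction terms and the intelligent sampling strategy of [1] are omitted.

    # Minimal first-order cut-HDMR sketch: f(x) = f0 + sum_i f_i(x_i), with
    # f0 = f(c) at an anchor (cut) point c and f_i(x_i) = f(c with x_i varied) - f0.
    import numpy as np

    def expensive_model(x):
        # Same hypothetical forward solver as in the previous sketch.
        return np.sum(np.sin(x)) + 0.1 * np.prod(np.cos(x))

    dim, n_grid = 8, 21
    anchor = np.zeros(dim)                 # cut point c
    f0 = expensive_model(anchor)           # zeroth-order term f(c)
    grid = np.linspace(-1.0, 1.0, n_grid)

    # Tabulate the 1-D component functions f_i on the grid: only dim * n_grid
    # forward solves are needed, instead of an exponentially large tensor grid.
    components = np.empty((dim, n_grid))
    for i in range(dim):
        for j, xi in enumerate(grid):
            point = anchor.copy()
            point[i] = xi
            components[i, j] = expensive_model(point) - f0

    def hdmr_predict(x):
        # First-order HDMR prediction, interpolating each tabulated f_i.
        return f0 + sum(np.interp(x[i], grid, components[i]) for i in range(dim))

    x_test = np.full(dim, 0.3)
    print(hdmr_predict(x_test), expensive_model(x_test))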

Keywords: Uncertainty Quantification, high-dimensional models, dimensionality reduction, deep learning.

REFERENCES:

[1] E. Li, H. Wang, G. Li, "High dimensional model representation (HDMR) coupled intelligent sampling strategy for nonlinear problems", Computer Physics Communications, vol. 183, no. 9, pp. 1947-1955, 2012.
[2] R. K. Tripathy, I. Bilionis, "Deep UQ: Learning deep neural network surrogate models for high dimensional uncertainty quantification", Journal of Computational Physics, vol. 375, pp. 565-588, 2018.

Schedule: 2019

Promoters

Yves Rolain
Francesco Ferranti (Institut Mines-Télécom Atlantique)