Workshop "Expériences et Simulations : comment les planifier et les exploiter au mieux ? "
Dates and venue
6 and 7 November 2025, Campus Hannah Arendt, Université d'Avignon, Avignon. The campus is located in the centre of Avignon, about 1 km from the Avignon Centre SNCF railway station.
About the workshop
This workshop, co-organised by the Lab. de Math. d'Avignon, CETAMA, RT-UQ and the NEEDS/QUTHY project (uncertainty quantification in accidental thermal-hydraulics for nuclear safety), aims to explain and illustrate the importance of planning experiments (whether physical or produced by numerical simulation codes) in connection with how they will be exploited. Particular attention will be paid to constrained design-of-experiments methods, to the characterisation of nuclear processes (CETAMA), and to the themes of the QUTHY project (representativeness and completeness of experimental databases, parameter calibration, and modelling of functional outputs of simulation codes). The workshop will bring together academic researchers as well as researchers and engineers from industry, fostering methodological exchanges and discussions of the difficulties that arise in practice.
Talks will be given in French or English.
Organisers: Jean Baccou (ASNR), Marielle Crozet (CEA ISEC, Marcoule, CETAMA), Bertrand Iooss (EDF R&D), Céline Lacaux (Université d'Avignon, Lab. de Mathématiques d'Avignon), Amandine Marrel (CEA IRESNE, Cadarache)
Sponsored by: CEA, CETAMA, LMA, Avignon Université, GDR MASCOT-NUM
Speakers
- Jean Baccou (ASNR) and Roman Sueur (EDF R&D): Multi-criteria analysis of the representativeness of experimental databases: application to nuclear safety studies
- Magalie Claeys-Bruno (Aix-Marseille Université): The WSP algorithm and its variants: an efficient approach for targeted and high-performance point selection
- Guillaume Damblin (CEA ISAS, Saclay): Inverse uncertainty quantification methods
- Séverine Demeyer (LNE): TBA
- Florian Gossard (Université de Toulouse): Kriging-based metamodel for probability measures
- Céline Helbert (Ecole Centrale de Lyon): A well-designed sampling strategy for constrained Bayesian optimization under uncertainties
- Astrid Jourdan (Université de Pau et des Pays de l'Adour): Space-filling designs for mixture formulation
- Amandine Marrel (CEA IRESNE, Cadarache): TBA
- Arthur Pellet-Rostaing (CEA IRESNE, Cadarache): Global sensitivity analysis of borehole-logging gamma-ray spectra for uranium prospecting and mining
- Damien Perret (CEA ISEC, Marcoule): Data-driven models to predict glass melt properties
- Luc Pronzato (CNRS, Université Côte d'Azur): Greedy algorithms for incremental design with guaranteed packing and covering performance
- Lucia Sargentini and Alberto Ghione (CEA ISAS, Saclay): How to quantify uncertainties in thermal-hydraulic simulations for nuclear safety? The international ATRIUM benchmark
Registration
Registration is free but mandatory.
Accommodation options
- Appart Hôtel Sainte Marthe
- Hôtel Médieval
- Hôtel Magnan
- Résidence Les Cordeliers
- Hôtel Ibis centre gare
Abstracts
* Jean Baccou (ASNR) and Roman Sueur (EDF R&D): TBA
* Magalie Claeys-Bruno (Aix-Marseille Université): The WSP algorithm and its variants: an efficient approach for targeted and high-performance point selection
Nowadays, in domains such as genetics, genomics, astrophysics, and spectral analysis, a major challenge lies in dealing with very high-dimensional spaces (D > 1000), which often lead to prohibitively long simulation times: numerical experiments may require several days of computation. Experimental design therefore becomes essential, yet traditional design-of-experiments (DoE) methods are no longer well suited to such contexts. We introduce a methodology for constructing space-filling designs (SFDs) based on the WSP algorithm, which selects a subset of points from a candidate set so as to achieve a uniform distribution over the domain of interest. Several variants of this method will be presented, in particular for mixture variables, for active deep learning approaches, for designs subject to specific experimental constraints, and for cases where the point density must be increased in a zone of particular interest.
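For readers unfamiliar with WSP-type selection, the following minimal Python sketch illustrates the general idea only: from a large candidate set, greedily keep points that are pairwise at least a distance dmin apart, moving each time to the closest surviving candidate. The candidate set, the threshold dmin and the choice of seed point are assumptions made for the illustration; this is not the exact algorithm, nor its variants, as presented in the talk.

import numpy as np

def wsp_like_selection(candidates, dmin, seed_index=0):
    """Greedily select a subset of candidate points that are pairwise
    at least `dmin` apart (the basic idea behind WSP-type algorithms)."""
    remaining = list(range(len(candidates)))
    current = seed_index
    selected = [current]
    remaining.remove(current)
    while remaining:
        # Drop all candidates closer than dmin to the point just selected.
        dists = np.linalg.norm(candidates[remaining] - candidates[current], axis=1)
        remaining = [idx for idx, d in zip(remaining, dists) if d >= dmin]
        if not remaining:
            break
        # Move to the closest surviving candidate and keep it in the design.
        dists = np.linalg.norm(candidates[remaining] - candidates[current], axis=1)
        current = remaining[int(np.argmin(dists))]
        selected.append(current)
        remaining.remove(current)
    return candidates[selected]

# Toy usage: 2000 random candidates in [0, 1]^5, minimum separation 0.5.
rng = np.random.default_rng(0)
design = wsp_like_selection(rng.random((2000, 5)), dmin=0.5)
print(design.shape)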
* Guillaume Damblin (CEA ISAS, Saclay): Inverse uncertainty quantification methods
Thermal-hydraulic numerical simulations are essential to study the behavior of nuclear power plants and to support safety demonstration for hypothetical accidental scenarios. Such simulations are established by making a trade-off between complexity and accuracy and, as a result, are affected by various types of uncertainty. In this presentation, we are interested in quantifying the epistemic uncertainty due both to model simplifications and to our imperfect knowledge of the physical phenomena. To this end, we have developed inverse statistical methods based on comparisons between simulations and experimental data in order to estimate both parameter and model uncertainties as probability distributions. Several methodological contributions are presented, such as hierarchical Bayesian inversion and non-Gaussian extensions.
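As a minimal illustration of the basic building block behind such inverse methods (and not of the hierarchical Bayesian or non-Gaussian extensions mentioned above), the sketch below calibrates a single parameter of a toy simulator from noisy data with a random-walk Metropolis sampler; the simulator, prior bounds and noise level are all invented for the example.

import numpy as np

rng = np.random.default_rng(1)

def simulator(theta, x):
    # Stand-in for an expensive physics code: a simple nonlinear model.
    return np.exp(-theta * x)

# Synthetic "experimental" data with a known noise level (assumed for the demo).
x_obs = np.linspace(0.1, 2.0, 15)
sigma = 0.05
y_obs = simulator(0.7, x_obs) + sigma * rng.standard_normal(x_obs.size)

def log_posterior(theta):
    if theta <= 0.0 or theta > 5.0:        # uniform prior on (0, 5]
        return -np.inf
    resid = y_obs - simulator(theta, x_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis: the posterior sample represents parameter uncertainty.
theta, samples = 1.0, []
for _ in range(20000):
    prop = theta + 0.1 * rng.standard_normal()
    if np.log(rng.random()) < log_posterior(prop) - log_posterior(theta):
        theta = prop
    samples.append(theta)

samples = np.array(samples[5000:])         # discard burn-in
print(samples.mean(), samples.std())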
* Séverine Demeyer (LNE): TBA
* Florian Gossard (Université de Toulouse): Kriging-based metamodel for probability measures
The prediction of complex data is a key topic in several applications related to nuclear safety, particularly in thermohydraulics for the analysis of accidental scenarios by numerical simulation. In practice, the amount of simulated data may be insufficient to perform a detailed analysis of a phenomenon of interest because of the high computational cost of a single simulation. To circumvent this difficulty, fast-to-evaluate models, or metamodels, can be used. In this work, we present the construction of a Kriging-based metamodel for complex data that are probability measures. We exploit the Wasserstein distance derived from optimal transport theory and propose to construct the Kriging estimator as a Wasserstein barycenter. This approach is then applied to the prediction of measures associated with the temperatures simulated in the core of a nuclear reactor during a loss-of-primary-coolant accident.
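In the one-dimensional case, the Wasserstein-2 barycenter of empirical distributions reduces to a weighted average of their quantile functions, which gives a simple way to illustrate the idea of predicting a probability measure from neighbouring ones. The sketch below uses inverse-distance weights as a stand-in for actual kriging weights (which would come from a covariance model and may be negative) and invented data; it only illustrates the barycenter construction, not the metamodel of the talk.

import numpy as np

def w2_barycenter_1d(samples_list, weights, n_quantiles=200):
    """Wasserstein-2 barycenter of 1D empirical distributions:
    average their quantile functions with the given (nonnegative) weights."""
    qs = np.linspace(0.0, 1.0, n_quantiles)
    quantile_curves = np.array([np.quantile(s, qs) for s in samples_list])
    return quantile_curves.T @ weights      # barycenter quantile function

# Toy setting: output distributions observed at three input points x_i,
# prediction requested at x_new with inverse-distance weights
# (a stand-in for kriging weights, which would come from a covariance model).
rng = np.random.default_rng(2)
x_train = np.array([0.0, 0.5, 1.0])
dists_out = [rng.normal(mu, 0.1, size=1000) for mu in (300.0, 320.0, 350.0)]

x_new = 0.4
w = 1.0 / (np.abs(x_train - x_new) + 1e-6)
w /= w.sum()

pred_quantiles = w2_barycenter_1d(dists_out, w)
print(pred_quantiles[::50])    # a few quantiles of the predicted distribution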
* Céline Helbert (Ecole Centrale de Lyon): A well-designed sampling strategy for constrained Bayesian optimization under uncertainties
We consider the problem of chance-constrained optimization, where one seeks to optimize a function and satisfy constraints that are both affected by uncertainties. The associated real-world applications are particularly challenging because of their inherent computational cost. To tackle such problems, we propose a Bayesian optimization method with a new, well-designed sampling strategy. The main contribution of this work is an acquisition criterion that accounts both for the average improvement of the objective function and for the constraint reliability. The criterion is derived in the joint controlled-uncontrolled input space following the Stepwise Uncertainty Reduction approach. Numerical studies on test functions are presented, and experimental comparisons show that our sampling strategy is effective in solving the optimization problem at hand. Extensions in which the method is coupled either with sensitivity analysis, to tackle higher-dimensional problems, or with multifidelity approaches are discussed.
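The acquisition criterion of the talk is a Stepwise Uncertainty Reduction criterion defined on the joint controlled-uncontrolled space. As a much simpler stand-in that still shows how an acquisition rule can combine improvement of the objective with constraint reliability, the sketch below uses the classical product of expected improvement and probability of feasibility, with Gaussian process surrogates from scikit-learn; the toy objective, constraint and one-dimensional search space are assumptions for the illustration and not the criterion presented in the talk.

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(3)

def f(x):          # toy objective to minimize
    return np.sin(6 * x) + 0.5 * x

def g(x):          # toy constraint, feasible when g(x) <= 0
    return 0.3 - x

X = rng.random((8, 1))
yf, yg = f(X).ravel(), g(X).ravel()

gp_f = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, yf)
gp_g = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, yg)

# Acquisition: expected improvement (for minimization) times probability of feasibility.
grid = np.linspace(0, 1, 500).reshape(-1, 1)
mu_f, sd_f = gp_f.predict(grid, return_std=True)
mu_g, sd_g = gp_g.predict(grid, return_std=True)

feasible = yg <= 0
best = yf[feasible].min() if feasible.any() else yf.min()
z = (best - mu_f) / np.maximum(sd_f, 1e-9)
ei = (best - mu_f) * norm.cdf(z) + sd_f * norm.pdf(z)
pof = norm.cdf((0.0 - mu_g) / np.maximum(sd_g, 1e-9))

x_next = grid[np.argmax(ei * pof)]
print("next evaluation point:", x_next)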
* Astrid Jourdan (Université de Pau et des Pays de l'Adour): Space-filling designs for mixture formulation
* Amandine Marrel (CEA IRESNE, Cadarache): TBA
* Arthur Pellet-Rostaing (CEA IRESNE, Cadarache): Global sensitivity analysis of borehole-logging gamma-ray spectra for uranium prospecting and mining
* Damien Perret (CEA ISEC, Marcoule): Data-driven models to predict glass melt properties
In situations where theoretical models cannot be efficiently applied to calculate glass properties, empirical statistical models are often required. This is typically the case when a glass contains a large number of components. Since the end of the 19th century, it has been known that, under certain conditions, silicate glass properties can be expressed as a simple linear combination of oxide contents (a minimal sketch of such an additive model is given after this abstract). This "principle of additivity" was initially introduced to calculate the heat capacity of glass, before being extended during the 20th century to a larger number of properties: optical, thermal, mechanical or rheological. In the 1990s, a statistical methodology was developed to establish robust property-composition models applicable to the formulation of nuclear waste conditioning glass. Since the 2000s, the significant increase in computing power has made it possible to use highly efficient machine learning algorithms in these predictive methods; for example, the glass transition temperature can now be accurately predicted using neural networks.
Glass viscosity prediction is much more challenging because of the huge variability of this property with temperature and composition. An innovative methodology is presented, combining statistical techniques from experimental design, multilinear regression and machine learning. It uses glass formulation data generated at CEA over the past 30 years as well as a large amount of data collected from the literature and from commercial databases. The results obtained for glass transition temperature and viscosity prediction are very accurate compared with other statistical models already published in the literature.
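As a minimal sketch of the additivity principle mentioned in the abstract (a glass property modelled as a linear combination of oxide contents), the following Python example fits the partial coefficients by least squares on invented compositions and property values; the oxides, coefficients and data are purely illustrative and unrelated to the CEA datasets used in the talk.

import numpy as np

# Illustrative "additivity" model: property ~ sum_i b_i * (oxide content)_i.
# The compositions and property values below are invented toy numbers,
# not CEA data; a real model would be fitted on measured glass compositions.
rng = np.random.default_rng(4)

oxides = ["SiO2", "B2O3", "Na2O", "Al2O3"]
X = rng.dirichlet(alpha=[8, 3, 2, 1], size=40)         # 40 glass compositions (mass fractions)
true_coefs = np.array([900.0, 300.0, -200.0, 650.0])    # fictitious partial coefficients
y = X @ true_coefs + rng.normal(0.0, 5.0, size=40)       # e.g. a transition temperature in °C

# Least-squares fit of the partial (additive) coefficients.
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, b in zip(oxides, coefs):
    print(f"{name}: {b:.1f}")

# Predicted property for a new composition.
new_glass = np.array([0.60, 0.15, 0.15, 0.10])
print("predicted property:", new_glass @ coefs)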
* Luc Pronzato (CNRS, Université Côte d'Azur): Greedy algorithms for incremental design with guaranteed packing and covering performance
Many methods exist for incremental design construction, in particular kernel-based methods: maximum entropy sampling, minimisation of the integrated mean-squared prediction error of a kriging predictor, or minimisation of a maximum mean discrepancy (taking the uniform measure on X as the target measure). Those we shall consider require no prior knowledge of the system under investigation and rely on the simple greedy-packing algorithm. This algorithm, whose complexity grows linearly with the design size k, generates designs that are at least 50% efficient for each k in terms of packing radius PR and covering radius CR, and are also optimal in the long run among all incremental constructions in terms of the mesh ratio CR/PR. We will see how a relaxation of the method can be used to generate designs that avoid the boundaries of X, or random designs in X, still with guaranteed performance in terms of packing and covering. The algorithm can be modified to take canonical projections of the design onto subspaces of interest into account, and a posteriori lower bounds on the packing and covering efficiencies of these projections can be provided. Efficiency bounds for packing and covering can be obtained in the same way for any given design.
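The greedy-packing (farthest-point) construction referred to above can be sketched in a few lines: at each step, the candidate point farthest from the current design is added, a construction the abstract states to be at least 50% efficient in terms of packing and covering radius. The candidate grid, design size and starting point below are assumptions for the illustration; the boundary-avoiding relaxations and projection-aware variants discussed in the talk are not shown.

import numpy as np

def greedy_packing(candidates, k, start=0):
    """Greedy packing (farthest-point) design: at each step, add the candidate
    point whose distance to the current design is largest."""
    design = [start]
    # Distance from every candidate to its nearest point in the current design.
    d = np.linalg.norm(candidates - candidates[start], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(d))
        design.append(nxt)
        d = np.minimum(d, np.linalg.norm(candidates - candidates[nxt], axis=1))
    return candidates[design]

# Toy usage: a 10-point incremental design on a fine grid of [0, 1]^2.
g = np.linspace(0, 1, 50)
grid = np.array(np.meshgrid(g, g)).reshape(2, -1).T
print(greedy_packing(grid, k=10))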
* Lucia Sargentini and Alberto Ghione (CEA ISAS, Saclay): How to quantify uncertainties in thermal-hydraulic simulations for nuclear safety? The international ATRIUM benchmark
The OECD/NEA project ATRIUM (Application Tests for Realization of Inverse Uncertainty quantification and validation Methodologies in thermal-hydraulics) aims at performing practical inverse uncertainty quantification (IUQ) exercises to bring new insights on the applicability of the SAPIUM best practices and on the improvements that may be needed when quantifying model uncertainties. Two IUQ exercises of increasing complexity were performed to quantify the uncertainties associated with the critical flow and post-Critical Heat Flux (CHF) heat transfer phenomena, which are important during a Loss Of Coolant Accident (LOCA) in a nuclear reactor. Particular attention was devoted to evaluating the adequacy of the experimental databases and to identifying the strengths and weaknesses of the available IUQ methodologies. The final step of the project consists in propagating the quantified uncertainties on a representative LOCA experiment in order to validate their application to LOCA-type experiments at larger scale.