Workshop on Uncertainty Quantification for Climate Science
- When & where
- Presentation of the workshop
- Schedule and slides
- Mini-courses (Wednesday 12 November afternoon)
- Specialized talks (Thursday 13 November)
- List of participants
When & where
The workshop took place on 12 and 13 November 2025 at Institut Henri Poincaré, Paris.
Room: amphithéâtre Hermite (ground floor).

Presentation of the workshop
This workshop was jointly organized by the RT UQ, the Institut des Mathématiques pour la Planète Terre and the GdR Défis théoriques pour les sciences du climat.
The objective of this workshop was to bring together members of different communities to discuss the methodological and practical aspects of uncertainty quantification in climate data, models and simulations. The topics covered included, but were not limited to: uncertainty quantification methodology, data assimilation, statistical modeling of extreme events, characterization and modeling of uncertainties in observational data, and the integration of machine learning tools into climate simulations.
The workshop consisted of two mini-courses on the afternoon of Wednesday 12 November, followed by a full day of specialized presentations on Thursday 13 November.
Organizers: Aurélie Fischer, Corentin Herbert, Clémentine Prieur, Julien Reygner, Jean-Noël Thépaut.
Schedule and slides
Wednesday 12 November (mini-courses)
13:15 - 13:30 Welcome
13:30 - 15:30 Eric Blayo: An introduction to data assimilation (slides)
15:30 - 16:00 Coffee break
16:00 - 18:00 Josselin Garnier: An introduction to uncertainty quantification (slides)
Thursday 13 November (research talks)
9:20 - 9:30 Welcome
9:30 - 10:15 Pierre Tandeo: Quantifying uncertainty in climate reanalyses (slides)
10:15 - 10:45 Coffee break
10:45 - 11:30 Marc Bocquet: Stochastic surrogate models, data assimilation, and uncertainty quantification (slides)
11:30 - 12:15 Maxime Taillardat: Correcting and using uncertainty quantification through statistical post-processing (slides)
12:15 - 14:00 Lunch (buffet at IHP)
14:00 - 14:45 Claire Monteleoni: Generative AI for Climate and Weather
14:45 - 15:30 Chris Merchant: Satellite Observations of Climate Change: Uncertainty and Stability (slides)
15:30 - 16:00 Coffee break
16:00 - 16:45 Laure Raynaud: UQ in weather forecasting: a practical view for operational applications, and the new paradigm of machine learning (slides)
16:45 - 17:30 Laurent Dubus: Uncertainty Matters: Implications for Energy Planning and Power System Resilience (slides)
Mini-courses (Wednesday 12 November afternoon)
- Eric Blayo, Université Grenoble Alpes: An introduction to data assimilation
- Josselin Garnier, École polytechnique: An introduction to uncertainty quantification
Specialized talks (Thursday 13 November)
- Marc Bocquet, École nationale des ponts et chaussées: Stochastic surrogate models, data assimilation, and uncertainty quantification
In a second example, I will show that sequential data assimilation itself can be learned, yielding methods that are significantly more efficient and robust than current state-of-the-art approaches. This, in turn, opens new algorithmic directions for the theory of data assimilation.
In both cases, the success of the neural network models critically relies on their ability to efficiently represent and quantify uncertainty.
- Laurent Dubus, RTE: Uncertainty Matters: Implications for Energy Planning and Power System Resilience
By highlighting challenges and practical implications, the talk aims to foster dialogue between climate scientists, energy modelers, and decision-makers, and to advocate for integrated approaches that better account for uncertainty in long-term energy planning.
- Chris Merchant, University of Reading: Satellite Observations of Climate Change: Uncertainty and Stability
A framework for organising and focusing the evaluation of uncertainty can be based on metrological principles adapted to the specific circumstances of Earth observations (a generic propagation-of-uncertainty formula is sketched after this abstract). While the uncertainty characterisation of climate data records (CDRs) in this manner requires significant effort, the process also challenges and improves the ideas and methods used to create these records.
A particular problem arises when quantifying long-term change. In the climate literature, the uncertainty in temporal trends is often discussed without considering observational stability. This is unsurprising, given that CDR creators usually do not provide information about stability. Furthermore, the discussion of stability in the context of requirements-setting for essential climate variables has arguably hindered progress in assessing it. Proposals to clarify the concept of stability will help creators and users of CDRs make more realistic evaluations of uncertainty in climate trends.
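As a generic illustration of the metrological principles mentioned above, and not of the specific framework presented in this talk, the GUM law of propagation of uncertainty combines the standard uncertainties of independent input quantities into a combined standard uncertainty for a measurand y = f(x_1, ..., x_n):

```latex
% Combined standard uncertainty (GUM, uncorrelated inputs)
u_c(y) = \sqrt{\sum_{i=1}^{n} \left( \frac{\partial f}{\partial x_i} \right)^{2} u^{2}(x_i)}
```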
- Claire Monteleoni, Inria Paris and University of Colorado Boulder: Generative AI for Climate and Weather
Using AI methods, especially generative AI, in climate science introduces additional sources of uncertainty beyond those already identified. However, the potential of such methods is great. Our team has shown the benefit of generative AI methods for data fusion, interpolation, downscaling, and domain alignment. I will provide a survey of our recent work applying these methods to problems including weather forecasting, with a particular focus on extreme events, climate model emulation and scenario generation, and renewable energy planning.
- Laure Raynaud, Centre National de Recherches Météorologiques: UQ in weather forecasting: a practical view for operational applications, and the new paradigm of machine learning
- Maxime Taillardat, Centre National de Recherches Météorologiques: Correcting and using uncertainty quantification through statistical post-processing
- Pierre Tandeo, IMT Atlantique: Quantifying uncertainty in climate reanalyses
In this talk, I will demonstrate the importance of the model-error (Q) and observation-error (R) covariance matrices in estimating the uncertainty of data assimilation algorithms. I will review several algorithms for jointly estimating the Q and R matrices. I will also introduce a new metric for quantifying the uncertainties of reanalyses: the credibility score. It is based on the notions of confidence intervals and coverage probability, and allows us to check whether uncertainties are underestimated or overestimated.
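As a minimal, illustrative sketch of the coverage-probability idea mentioned above (this is not the credibility score itself; the Gaussian toy data and variable names are assumptions made for the example):

```python
import numpy as np

def empirical_coverage(truth, lower, upper):
    """Fraction of cases in which the true value falls inside the predicted interval."""
    truth, lower, upper = map(np.asarray, (truth, lower, upper))
    return float(np.mean((truth >= lower) & (truth <= upper)))

# Toy example: standard-normal truths, forecast spread deliberately too small.
rng = np.random.default_rng(0)
truth = rng.normal(size=10_000)
sigma = 0.7 * np.ones_like(truth)   # understated uncertainty
z90 = 1.645                         # two-sided 90% Gaussian quantile
coverage = empirical_coverage(truth, -z90 * sigma, z90 * sigma)

print(f"nominal coverage: 0.90, empirical coverage: {coverage:.2f}")
# Empirical coverage below the nominal level signals underestimated uncertainties;
# above it, overestimated uncertainties.
```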