ANR Project: GATSBII (2025-2029)
GAme Theory and Statistical estimation Bring Importance Measures and Interpretability
- Project summary
- People
- Internal meetings
- Master internship, PhD thesis and Post-doctoral offers
- Publications
- Communications
- Software
Project summary
The extensive use of machine learning (ML) models in data-based artificial intelligence (AI) systems, especially those subject to new European regulations, requires explanations in order to make these AI systems intelligible and auditable. Explainability in AI (XAI) refers to the ability of a human to understand and communicate about the decisions proposed by ML models. One of the primary challenges in XAI is to understand the influence of the features ("inputs") on the predicted variables ("outputs") and to provide global interpretability diagnostics. Strong connections between XAI and Global Sensitivity Analysis (GSA) of model outputs have recently been highlighted. Indeed, GSA studies how the uncertainty in a model output can be apportioned to the different sources of uncertainty in its inputs. Over the last decades, applied mathematicians have developed several GSA indices (also called importance measures (IM)) to quantify the relative importance of the inputs on the output, whereas XAI mainly relies on heuristic methods.
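To fix ideas on what such an importance measure computes, here is a minimal, illustrative sketch (not part of the official project material) of the classical pick-freeze estimator of first-order Sobol' indices, applied to the standard Ishigami test function with independent uniform inputs; the model, sample size and coefficients are chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Ishigami test function of three independent inputs on [-pi, pi]
    return np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2 + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0])

n, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, size=(n, d))  # first independent input sample
B = rng.uniform(-np.pi, np.pi, size=(n, d))  # second independent input sample
yA = model(A)
var_y = yA.var()

# Pick-freeze: keep ("freeze") input i from sample A, redraw all other inputs from B
for i in range(d):
    C = B.copy()
    C[:, i] = A[:, i]
    yC = model(C)
    # First-order Sobol' index S_i = Cov(Y_A, Y_C) / Var(Y)
    S_i = np.cov(yA, yC)[0, 1] / var_y
    print(f"S_{i + 1} ~ {S_i:.3f}")
```

Each S_i estimates the share of the output variance explained by input i alone; for the Ishigami function the third input has a null first-order index while still interacting with the first one, which is exactly the kind of distinction (main effect versus interaction) that importance measures are designed to capture.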
In GSA and XAI, several fundamental open issues remain. How should the statistical dependence between inputs be taken into account, and in particular, how can its effects be distinguished from the inputs' interaction effects? How can the indices be estimated when the input design of experiments cannot be chosen a priori? How can the indices be estimated online? How can high-dimensional models be handled? These open issues are even more relevant in an industrial context, where the input variables are rarely independent and the design of experiments cannot be chosen. The project aims to address these challenging questions by bringing together statisticians who are specialists and practitioners of GSA/XAI and researchers in Cooperative Game Theory, and by exploring several tracks (hierarchical models and tools from game theory to deal with dependence, stochastic algorithms to build online estimators, ...).
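The game-theoretic track can be illustrated with the Shapley value itself. The toy sketch below (an assumption-laden example, not a project deliverable) computes exact Shapley values of a cooperative game by enumerating all coalitions; in the Shapley-effects setting, the value of a coalition of inputs would be its closed Sobol' index, and the resulting allocation shares the output variance among possibly dependent inputs. The numerical values of the toy game are hypothetical.

```python
from itertools import combinations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley values of a cooperative game (value: frozenset -> float).

    For Shapley effects, value(S) would be the closed Sobol' index of the
    input subset S, so the Shapley values sum to 1 and allocate the output
    variance among possibly dependent inputs.
    """
    n = len(players)
    phi = {p: 0.0 for p in players}
    for p in players:
        others = [q for q in players if q != p]
        for k in range(n):
            for S in combinations(others, k):
                S = frozenset(S)
                # Shapley weight of a coalition of size k not containing p
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi[p] += w * (value(S | {p}) - value(S))
    return phi

# Toy value function standing in for closed Sobol' indices (hypothetical numbers)
closed_indices = {
    frozenset(): 0.0,
    frozenset({"X1"}): 0.5, frozenset({"X2"}): 0.3, frozenset({"X3"}): 0.0,
    frozenset({"X1", "X2"}): 0.9, frozenset({"X1", "X3"}): 0.6,
    frozenset({"X2", "X3"}): 0.4, frozenset({"X1", "X2", "X3"}): 1.0,
}
print(shapley_values(["X1", "X2", "X3"], closed_indices.get))
```

Exact enumeration requires the value of all 2^d coalitions, which is intractable beyond a handful of inputs; this is why fast and consistent estimators (such as the SHAFF approach presented at the January 2025 meeting) are of direct interest to the project.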
People
- Thierry Klein (coordinator) - ENAC
- Bertrand Iooss - EDF R&D
- Agnès Lagnoux - Institut de Mathématiques de Toulouse, Université Toulouse Jean Jaurès
- Nicolas Bousquet - EDF R&D
- Stefano Moretti - Paris Dauphine
- Clément Bénard - Thales cortAIx-Labs
- Hugo Gilbert - Université Paris Dauphine - PSL
- Sébastien Da Veiga - CREST-ENSAI
Internal meetings
2025, January 30-31
- Laura Clouvel, Bertrand Iooss and Vincent Chabridon (EDF R&D) - Importance measures with dependent inputs: linear and non-linear cases - gatsbiijan25_clouveliooss.pdf
- Stefano Moretti (UPD) - Social ranking for feature selection - leXAI300125.pdf
- Julien Pelamatti and Vincent Chabridon - Sensitivity analysis in presence of hierarchical variables
- Sébastien Gadat (Univ. Toulouse) - Global Sobol indices estimation with a sequential stochastic mirror descent
- Clément Benard (Thales) and Sébastien Da Veiga (ENSAI) - SHAFF: Fast and consistent Shapley effect estimates via random forests gatsbiijan25_benard.pdf
- Agnès Lagnoux (Univ. Toulouse) - Efficient estimation of Sobol’ indices of any order from a single input/output sample gatsbiijan25_lagnoux.pdf
- Paul Rochet (ENAC)
- Christophe Labreuche (Thales) - Winter values and Banzhaf indices for interpretability
Master internship, PhD thesis and Post-doctoral offers
- Institut de Mathématiques de Toulouse - Asymptotic efficiency and global sensitivity analysis - stage2025_eff.pdf - https://uq.math.cnrs.fr/gatsbii
- Institut de Mathématiques de Toulouse - Online estimation of Sobol' indices with a stochastic mirror descent optimisation algorithm - stage2025_smd.pdf - https://uq.math.cnrs.fr/gatsbii