2023 Activity Report Project-Team SIMSMART
RNSR: 201822633C - Research center: Inria Centre at Rennes University
- In partnership with: CNRS, Université de Rennes
- Team name: SIMulating Stochastic Models with pARTicles
- In collaboration with: Institut de recherche mathématique de Rennes (IRMAR)
- Domain: Applied Mathematics, Computation and Simulation
- Theme: Stochastic approaches
Keywords
Computer Science and Digital Science
- A6. Modeling, simulation and control
- A6.1. Methods in mathematical modeling
- A6.1.1. Continuous Modeling (PDE, ODE)
- A6.1.2. Stochastic Modeling
- A6.1.4. Multiscale modeling
- A6.2. Scientific computing, Numerical Analysis & Optimization
- A6.2.1. Numerical analysis of PDE and ODE
- A6.2.2. Numerical probability
- A6.2.3. Probabilistic methods
- A6.2.4. Statistical methods
- A6.2.5. Numerical Linear Algebra
- A6.2.6. Optimization
- A6.3. Computation-data interaction
- A6.3.1. Inverse problems
- A6.3.2. Data assimilation
- A6.3.4. Model reduction
- A6.3.5. Uncertainty Quantification
- A6.5. Mathematical modeling for physical sciences
- A6.5.2. Fluid mechanics
- A6.5.3. Transport
- A6.5.5. Chemistry
Other Research Topics and Application Domains
- B1. Life sciences
- B2. Health
- B3. Environment and planet
- B3.2. Climate and meteorology
- B4. Energy
- B4.2. Nuclear Energy Production
- B4.2.1. Fission
- B5.3. Nanotechnology
- B5.5. Materials
1 Team members, visitors, external collaborators
Research Scientists
- Mathias Rousset [Team leader, INRIA, Researcher, HDR]
- Frédéric Cérou [INRIA, Researcher]
- Cédric Herzet [INRIA, Researcher, until Aug 2023, HDR]
- Patrick Héas [INRIA, Researcher]
Faculty Member
- Valérie Monbet [UNIV RENNES, Professor, HDR]
PhD Students
- François Ernoult [UNIV RENNES, until Sep 2023]
- Théo Guyard [INSA RENNES]
- Thu Le Tran [UNIV RENNES, until Sep 2023]
Interns and Apprentices
- Chloé Caville [CENTRALE NANTES, Intern, from Apr 2023 until Aug 2023]
Administrative Assistant
- Gunther Tessier [INRIA]
2 Overall objectives
As the constant surge of computational power nurtures scientists into simulating the most detailed features of reality, from complex molecular systems to climate and weather forecasting, the computer simulation of physical systems is becoming reliant on highly complex stochastic dynamical models and very abundant observational data. The complexity of such models and of the associated observational data stems from intrinsic physical features, which include high dimensionality as well as intricate temporal and spatial multi-scales. It also results in much less control over simulation uncertainty.
Within this highly challenging context, SIMSMART positions itself as a mathematical and computational probability and statistics research team, dedicated to Monte Carlo simulation methods. Such methods include in particular particle Monte Carlo methods for rare event simulation, data assimilation and model reduction, with application to stochastic random dynamical physical models. The main objective of SIMSMART is to disrupt this now classical field by creating deeper mathematical frameworks adapted to the management of contemporary highly sophisticated physical models.
3 Research program
Introduction. Computer simulation of physical systems is becoming increasingly reliant on highly complex models, as the constant surge of computational power is nurturing scientists into simulating the most detailed features of reality – from complex molecular systems to climate/weather forecast.
Yet, when modeling physical reality, bottom-up approaches stumble over intrinsic difficulties. First, the timescale separation between the fastest simulated microscopic features and the macroscopic effective slow behavior becomes huge, implying that the fully detailed, direct long-time simulation of many interesting systems (e.g. large molecular systems) is out of reasonable computational reach. Second, the chaotic dynamical behaviors of the systems at stake, coupled with such multi-scale structures, exacerbate the uncertainty of outcomes, which become highly dependent on intrinsic chaos, uncontrolled modeling, and numerical discretization. Finally, the massive increase of observational data poses new challenges to classical data assimilation, such as dealing with high-dimensional observations and/or extremely long time series of observations.
SIMSMART Identity. Within this highly challenging applicative context, SIMSMART positions itself as a computational probability and statistics research team with a mathematical perspective. Our approach is based on stochastic modeling of complex physical systems and on Monte Carlo simulation methods, with a strong emphasis on dynamical models. The two main numerical tasks of interest to SIMSMART are the following: (i) simulating dynamical models of random physical systems with pseudo-random number generators, a.k.a. sampling; (ii) sampling such random dynamical models conditionally on real observations, a.k.a. Bayesian data assimilation. SIMSMART aims at providing an appropriate mathematical level of abstraction and generalization to a wide variety of Monte Carlo simulation algorithms in order to propose non-superficial answers to both methodological and mathematical challenges. The issues to be resolved include computational complexity reduction, statistical variance reduction, and uncertainty quantification.
SIMSMART Objectives. The main objective of SIMSMART is to disrupt this now classical field of particle Monte Carlo simulation by creating deeper mathematical frameworks adapted to the challenging world of complex (e.g. high dimensional and/or multi-scale), and massively observed systems, as described in the beginning of this introduction.
To be more specific, we will classify SIMSMART objectives using the following four intertwined topics:
- Objective 1: Rare events and random simulation.
- Objective 2: High dimensional and advanced particle filtering.
- Objective 3: Non-parametric approaches.
- Objective 4: Model reduction and sparsity.
Rare events (Objective 1) are ubiquitous in random simulation, either to accelerate the occurrence of physically relevant slow random phenomena, or to estimate the effect of uncertain variables. Objective 1 is mainly concerned with particle methods where splitting is used to enforce the occurrence of rare events.
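As a purely illustrative aside, the following minimal sketch shows the splitting idea in its simplest adaptive multilevel form on a toy problem, estimating the Gaussian tail probability P(X > 4); the algorithmic structure is standard, but all names and parameter values below are our own choices, not team software.

```python
import numpy as np

rng = np.random.default_rng(0)

def ams_tail_probability(n=200, k=10, level=4.0, n_mcmc=20):
    """Toy adaptive multilevel splitting: estimate P(X > level), X ~ N(0,1).

    At each iteration the k lowest-scoring particles are killed and
    resampled from the survivors, then decorrelated by Metropolis moves
    constrained to stay above the current threshold."""
    x = rng.standard_normal(n)             # initial i.i.d. particle cloud
    p_hat = 1.0
    while True:
        x.sort()
        threshold = x[k - 1]               # k-th smallest score
        if threshold >= level:
            break
        p_hat *= (n - k) / n               # survival fraction at this level
        x[:k] = rng.choice(x[k:], size=k)  # clone survivors onto killed slots
        # Metropolis moves targeting N(0,1) restricted to {x > threshold}
        for _ in range(n_mcmc):
            prop = x + 0.5 * rng.standard_normal(n)
            accept = (prop > threshold) & (
                rng.random(n) < np.exp(0.5 * (x**2 - prop**2)))
            x = np.where(accept, prop, x)
    return p_hat * np.mean(x > level)

print(ams_tail_probability())              # reference: 1 - Phi(4) ≈ 3.2e-5
```

The key point is that each intermediate level is reached with non-vanishing probability, so the product of survival fractions estimates the rare-event probability with far lower variance than naive Monte Carlo.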
The problem of high-dimensional observations, the main topic of Objective 2, is a known bottleneck in filtering, especially in non-linear particle filtering, where linear data assimilation methods remain the state of the art.
The increasing size of recorded observational data and the increasing complexity of models also suggest devoting more effort to non-parametric data assimilation methods, the main issue of Objective 3.
In some contexts, for instance when one wants to compare solutions of a complex (e.g. high-dimensional) dynamical system depending on uncertain parameters, the construction of relevant reduced-order models becomes a key topic. Model reduction aims at proposing efficient algorithmic procedures for the resolution (to some reasonable accuracy) of high-dimensional systems of parametric equations. This overall objective entails several subtasks: 1) the identification of low-dimensional surrogates of the target "solution" manifold; 2) the design of efficient resolution methodologies exploiting low-dimensional surrogates; 3) the theoretical validation of the accuracy achievable by the proposed procedures. This is the content of Objective 4.
In terms of volume of research activity, Objective 1, Objective 4, and Objectives 2 and 3 combined are comparable.
Some new challenges in the simulation and data assimilation of random physical dynamical systems have become prominent in the last decade. A first issue (i) consists in the intertwined problems of simulating over large, macroscopic random times and simulating rare events (see Objective 1); the link between the two stems from the fact that many effective large-time dynamics can be approximated by sequences of rare events. A second, obvious, issue (ii) consists in managing very abundant observational data (see Objectives 2 and 3). A third issue (iii) consists in quantifying the uncertainty/sensitivity/variance of outcomes with respect to models or noise. A fourth issue (iv) consists in managing high dimensionality, either when dealing with complex prior physical models or with very large data sets. The related increase in complexity also requires, as a fifth issue (v), the construction of reduced models to speed up comparative simulations (see Objective 4). In a context of very abundant data, this may be replaced by a sixth issue (vi), where complexity constraints on modeling are replaced by the use of non-parametric statistical inference (see Objective 3).
Hindsight suggests that all the latter challenges are related. Indeed, the contemporary digital condition, made of a massive increase in computational power and in available data, is resulting in a demand for more complex and uncertain models, for more extreme regimes, and for using inductive approaches relying on abundant data. In particular, uncertainty quantification (item (iii)) and high dimensionality (item (iv)) are in fact present in all 4 Objectives considered in SimSmart.
4 Application domains
4.1 Domain 1 – Computational Physics
The development of large-scale computing facilities has enabled simulations of systems at the atomistic scale on a daily basis. The aim of these simulations is to bridge the time and space scales between the macroscopic properties of matter and the stochastic atomistic description. Typically, such simulations are based on the ordinary differential equations of classical mechanics supplemented with a random perturbation modeling temperature, or collisions between particles.
Let us give a few examples. In bio-chemistry, such simulations are key to predicting the influence of a ligand on the behavior of a protein, with applications to drug design. The computer can thus be used as a numerical microscope to access data that would be very difficult and costly to obtain experimentally. In that case, a rare event (Objective 1) is given by a macroscopic system change such as a conformational change of the protein. In nuclear safety, such simulations are key to predicting the transport of neutrons in nuclear plants, with application to assessing the aging of concrete. In that case, a rare event is given by a high-energy neutron impacting concrete containment structures.
A typical model used in molecular dynamics simulation of open systems at a given temperature is a stochastic differential equation of Langevin type. The large-time behavior of such systems is typically characterized by a hopping dynamics between 'metastable' configurations, usually defined by local minima of a potential energy. In order to bridge the time and space scales between the atomistic level and the macroscopic level, specific algorithms enforcing the realization of rare events have been developed. For instance, splitting particle methods (Objective 1) have become popular within the computational physics community only in the last few years, partially as a consequence of interactions between physicists and Inria mathematicians in the ASPI (parent of SIMSMART) and MATHERIALS project-teams.
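To illustrate the metastable hopping described above, here is a generic textbook example (not team code; the double-well potential, time step, and inverse temperature are arbitrary choices) of an Euler-Maruyama simulation of an overdamped Langevin dynamics:

```python
import numpy as np

rng = np.random.default_rng(1)

def grad_V(x):
    """Gradient of the double-well potential V(x) = (x^2 - 1)^2."""
    return 4.0 * x * (x**2 - 1.0)

def langevin_trajectory(x0=-1.0, beta=5.0, dt=1e-3, n_steps=200_000):
    """Euler-Maruyama discretization of the overdamped Langevin SDE
    dX_t = -grad V(X_t) dt + sqrt(2/beta) dW_t."""
    x = np.empty(n_steps)
    x[0] = x0
    noise = np.sqrt(2.0 * dt / beta) * rng.standard_normal(n_steps - 1)
    for i in range(n_steps - 1):
        x[i + 1] = x[i] - grad_V(x[i]) * dt + noise[i]
    return x

traj = langevin_trajectory()
# hops between the wells at x = -1 and x = +1 are the rare metastable
# transitions that rare event algorithms are designed to accelerate
print("fraction of time in the right well:", np.mean(traj > 0))
```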
SIMSMART also focuses on various models described by partial differential equations (reaction-diffusion, conservation laws), with unknown parameters modeled by random variables.
4.2 Domain 2 – Meteorology
The traditional trend in data assimilation in geophysical sciences (climate, meteorology) is to use as prior information very complex deterministic models, formulated in terms of fluid dynamics and reflecting as much as possible the underlying physical phenomena. Weather/climate forecasting can then be recast as a Bayesian filtering problem (see Objective 2) using weather observations collected in situ.
The main issue is therefore to perform such Bayesian estimation with very expensive, infinite-dimensional prior models and high-dimensional observations. Using linear assumptions in the prior models (Kalman filtering) to filter non-linear hydrodynamical phenomena is the state-of-the-art approach, and a current field of research, but is plagued with intractable instabilities.
This context motivates two research trends: (i) the introduction of non-parametric, model-free prior dynamics constructed from a large amount of past, recorded real weather data; and (ii) the development of appropriate non-linear filtering approaches (Objective 2 and Objective 3).
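As a point of reference for trend (ii), the following toy bootstrap particle filter shows the basic non-linear filtering mechanism that such research refines; the state-space model and all parameter values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

def bootstrap_filter(y, n_part=500, sigma_x=0.5, sigma_y=0.3):
    """Bootstrap particle filter for the toy non-linear state-space model
    x_t = 0.9*sin(x_{t-1}) + sigma_x*eps_t,   y_t = x_t + sigma_y*eta_t."""
    x = rng.standard_normal(n_part)            # initial particle cloud
    means = []
    for obs in y:
        # propagate particles through the prior dynamics
        x = 0.9 * np.sin(x) + sigma_x * rng.standard_normal(n_part)
        # reweight by the observation likelihood
        logw = -0.5 * ((obs - x) / sigma_y) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))            # filtered posterior mean
        # multinomial resampling to fight weight degeneracy
        x = rng.choice(x, size=n_part, p=w)
    return np.array(means)

# simulate synthetic observations from the same model, then filter them
x_true, y = 0.0, []
for _ in range(100):
    x_true = 0.9 * np.sin(x_true) + 0.5 * rng.standard_normal()
    y.append(x_true + 0.3 * rng.standard_normal())
print(bootstrap_filter(np.array(y))[-5:])      # last few filtered means
```

The weight degeneracy visible in such filters when the state or observation dimension grows is precisely the bottleneck motivating Objective 2.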
SIMSMART will also test its new methods on multi-source data collected in the North Atlantic, paying particular attention to coastal areas (e.g. within the inter-Labex SEACS).
4.3 Other Applicative Domains
SIMSMART also focuses on other applications including:
- Tracking and hidden Markov models.
- Robustness and certification in Machine Learning.
5 Social and environmental responsibility
5.1 Footprint of research activities
Members of SimSmart have avoided air travel, with the notable exception of a few international conferences with publications for PhD students (this year, Théo Guyard), which are considered important for their academic future.
6 Highlights of the year
6.1 Awards
Théo Guyard, "Best student paper award", conference ROADEF 2023.
7 New software, platforms, open data
7.1 New software
7.1.1 Screening4L0Problem
- Keywords: Global optimization, Sparsity
- Functional Description: This software contains branch-and-bound optimization routines exploiting "screening" acceleration rules for solving sparse representation problems involving the L0 pseudo-norm.
- Contact: Cédric Herzet
- Participants: Clément Elvira, Théo Guyard, Cédric Herzet
7.1.2 Screen&Relax
- Keywords: Optimization, Sparsity
- Functional Description: This software provides optimization routines to efficiently solve the "ElasticNet" problem.
- Contact: Cédric Herzet
- Participants: Clément Elvira, Théo Guyard, Cédric Herzet
7.1.3 npSEM
- Name: Stochastic expectation-maximization algorithm for non-parametric state-space models
- Keyword: Statistical analysis
- Functional Description: npSEM combines a non-parametric estimate of the dynamics using local linear regression (LLR), a conditional particle smoother, and a stochastic Expectation-Maximization (SEM) algorithm. Further details on its construction and implementation are given in the article "An algorithm for non-parametric estimation in state-space models" by T.T.T. Chau, P. Ailliot, and V. Monbet, https://doi.org/10.1016/j.csda.2020.107062.
- Contact: Thi Tuyet Trang Chau
- Participants: Valérie Monbet, Thi Tuyet Trang Chau
7.1.4 NHMSAR
- Name: Non-Homogeneous Markov Switching Autoregressive Models
- Keyword: Statistical learning
- Functional Description: Calibration, simulation, and validation of (non-)homogeneous Markov switching autoregressive models with Gaussian or von Mises innovations. Penalization methods are implemented for Markov switching vector autoregressive models of order 1 only. Most functions of the package handle missing values.
- Contact: Valérie Monbet
- Participant: Valérie Monbet
7.1.5 3D Winds Fields Profiles
- Functional Description: The algorithm computes vertical profiles of 3D Atmospheric Motion Vectors (AMVs), using incomplete maps of humidity, temperature, and ozone concentration observed over a range of isobaric levels. The code is implemented for operational use with the Infrared Atmospheric Sounding Interferometer (IASI) carried on the MetOp satellite.
- Contact: Patrick Héas
- Participant: Patrick Héas
7.1.6 Screening4SLOPE
- Keyword: Optimization
- Functional Description: This software provides optimization routines to solve the SLOPE problem by exploiting "safe screening" reduction techniques.
- Contact: Cédric Herzet
- Participants: Clément Elvira, Cédric Herzet
8 New results
8.1 Objective 1 – Monte Carlo simulation and Stochastic analysis
Monte-Carlo simulation
Participants: Frédéric Cérou, Patrick Héas, Mathias Rousset, François Ernoult.
In 3, we study a real-world, high-dimensional Bayesian sampling problem (weather variables observed by space imagery) using kinetic Langevin diffusions (Hamiltonian Monte Carlo), and show empirically the benefit for convergence of an artificial "cold" tempering taming the non-linearities of the likelihood.
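For orientation, the sketch below shows one generic Hamiltonian Monte Carlo step with a leapfrog integrator and a tempering parameter beta rescaling the log-posterior; this is a textbook scheme on a toy Gaussian target, not the algorithm of 3, and all settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def hmc_step(x, log_post, grad_log_post, eps=0.1, n_leap=10):
    """One Hamiltonian Monte Carlo step with a leapfrog integrator."""
    p = rng.standard_normal(x.shape)              # resample momentum
    x_new, p_new = x.copy(), p.copy()
    p_new += 0.5 * eps * grad_log_post(x_new)     # half kick
    for _ in range(n_leap - 1):
        x_new += eps * p_new                      # drift
        p_new += eps * grad_log_post(x_new)       # full kick
    x_new += eps * p_new
    p_new += 0.5 * eps * grad_log_post(x_new)     # final half kick
    # Metropolis accept/reject on the total Hamiltonian
    log_ratio = (log_post(x_new) - 0.5 * p_new @ p_new) \
              - (log_post(x) - 0.5 * p @ p)
    return x_new if np.log(rng.random()) < log_ratio else x

beta = 0.5                                        # tempering parameter
log_post = lambda x: -0.5 * beta * (x @ x)        # tempered Gaussian target
grad_log_post = lambda x: -beta * x
x, samples = np.zeros(2), []
for _ in range(2000):
    x = hmc_step(x, log_post, grad_log_post)
    samples.append(x.copy())
print(np.cov(np.array(samples).T))                # ≈ (1/beta) * identity
```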
In 12, we obtained a rigorous general theorem for strong-noise homogenization, a recent problem where a strong (fast) noise coerces a Markov process near an effective sub-state space. Applications include quantum control and filtering.
In 11, we proposed a new rare event sampling methodology in a context where the evaluation of the score function defining the rare event is amenable to reduced modeling with pointwise error bounds. The novelty is the use of an Importance Sampling cost criterion that automatically chooses the levels at which costly evaluations of the true model are performed.
In 10, we quantify the robustness of a trained network to input uncertainties with a splitting rare event simulation Monte Carlo method that uses a gradient-informed (Langevin-like) kernel.
8.2 Objective 2 and 3 – Data assimilation and statistics
Participants: Patrick Héas, Valérie Monbet.
In 4, we present an efficient optic flow algorithm for the extraction of vertically resolved 3D atmospheric motion vector (AMV) fields from incomplete hyperspectral image data measured by infrared sounders.
In 1, a latent Markovian model with parametric noise and non-parametric drift is inferred from a historical catalog of data (a time series) using a stochastic Expectation-Maximization iterative scheme. At each iteration, the smoothed (conditional on data) distribution of the current hidden model is simulated using advanced pathwise sequential Monte Carlo particle filters (conditional particle filters with backward simulation); the model is then re-estimated using both parametric maximum likelihood and non-parametric machine learning tools.
Motivated by applications to weather/climate data, estimation methods based on Gaussian mixtures for the calibration of ensemble forecasts have been developed in 5.
8.3 Objective 4 – Model Reduction and Sparsity
Participants: Patrick Héas, Cédric Herzet, Théo Guyard.
In 2, we propose a methodology to accelerate the resolution of the so-called "Sorted L-One Penalized Estimation" (SLOPE) problem. Our method leverages the concept of "safe screening", well studied in the literature for group-separable sparsity-inducing norms, and aims at identifying the zeros in the solution of SLOPE.
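To convey the flavor of safe screening, here is a sketch of the classical Gap Safe sphere test on the plain Lasso (simpler than SLOPE, which requires sorted-norm machinery); the test certifies that some coordinates are zero at the optimum so they can be discarded. The data and regularization level below are illustrative.

```python
import numpy as np

def gap_safe_screen(A, y, x, lam):
    """Gap Safe sphere test for the Lasso  min_x 0.5*||y - A x||^2 + lam*||x||_1.
    Returns a boolean mask of coordinates provably zero at the optimum."""
    res = y - A @ x
    theta = res / max(lam, np.abs(A.T @ res).max())   # dual-feasible point
    primal = 0.5 * (res @ res) + lam * np.abs(x).sum()
    dual = 0.5 * (y @ y) - 0.5 * lam**2 * np.sum((theta - y / lam) ** 2)
    radius = np.sqrt(2.0 * max(primal - dual, 0.0)) / lam
    # coordinate j is provably inactive if |a_j' theta| + radius*||a_j|| < 1
    return np.abs(A.T @ theta) + radius * np.linalg.norm(A, axis=0) < 1.0

# usage: screen at a crude iterate (here x = 0), then solve the reduced problem
rng = np.random.default_rng(4)
A = rng.standard_normal((50, 200))
y = A[:, :5] @ np.ones(5) + 0.01 * rng.standard_normal(50)
lam = 0.9 * np.abs(A.T @ y).max()
mask = gap_safe_screen(A, y, np.zeros(200), lam)
print("coordinates eliminated:", int(mask.sum()), "out of", A.shape[1])
```

The mask tightens as the solver iterate improves (the duality gap shrinks), which is what makes such rules useful inside an optimization loop.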
In 7, 8, 9, 13, we introduce a new methodology dubbed "safe peeling" to accelerate the resolution of L0-regularized least-squares problems via a Branch-and-Bound (BnB) algorithm. Our procedure tightens the convex relaxation considered at each node of the BnB decision tree and therefore potentially allows for more aggressive pruning.
In the context of model reduction, one issue is to find fast algorithms to project onto low-dimensional, sparse models. In 14, we study the non-linear approximation of high-dimensional dynamical systems using low-rank dynamic mode decomposition (DMD) and embeddings of trajectories in a reproducing kernel Hilbert space (RKHS).
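As background for 14, here is a minimal sketch of standard rank-r dynamic mode decomposition (without the RKHS embedding studied in the paper); the snapshot data below are synthetic.

```python
import numpy as np

def dmd(snapshots, rank=2):
    """Rank-r Dynamic Mode Decomposition of a snapshot matrix.
    Returns eigenvalues and (projected) modes of the best-fit linear
    operator A such that snapshots[:, t+1] ≈ A @ snapshots[:, t]."""
    X, Y = snapshots[:, :-1], snapshots[:, 1:]
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    A_tilde = (U.conj().T @ Y @ Vh.conj().T) / s    # A projected on POD modes
    eigvals, W = np.linalg.eig(A_tilde)
    return eigvals, U @ W                           # modes lifted to full space

# usage: a noisy traveling wave; the oscillatory pair has |eigenvalue| ≈ 1
t = np.linspace(0.0, 4.0 * np.pi, 100)
x = np.linspace(0.0, 1.0, 64)[:, None]
data = np.sin(8.0 * x - t) + 0.01 * np.random.default_rng(5).standard_normal((64, 100))
eigvals, modes = dmd(data, rank=2)
print(np.abs(eigvals))
```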
9 Bilateral contracts and grants with industry
9.1 Bilateral contracts with industry
9.1.1 CIFRE grants
-
PhD project of Victor Bertret: AI and stochastic control for automatic optimal driving of industrial systems, with the company Purecontrol.
Participants: Valérie Monbet.
9.1.2 Meteorological Satellite Data Processing
Participants: Patrick Héas.
Industrial Partner: EUMETSAT of Darmstadt.
Partner Contact: Regis.Borde@eumetsat.int
The transferred technology concerns an algorithm for the operational and real-time production of vertically resolved 3D atmospheric motion vector fields (AMVs) from measurements of new hyperspectral instruments: the infrared radiosounders on the third generation Meteosat satellites (MTG), developed by the European Space Agency (ESA) and the Infrared Atmospheric Sounding Interferometer (IASI) on MetOp-A and MetOp-B developed by the French Space Agency (CNES).
10 Partnerships and cooperations
10.1 National initiatives
10.1.1 ANR
-
ANR MELODY (2020-2024)
Participants: Cédric Herzet.
The MELODY project aims to bridge the physical model-driven paradigm underlying ocean/atmosphere science and AI paradigms, with a view to developing geophysically-sound, learning-based and data-driven representations of geophysical flows accounting for their key features (e.g., chaos, extremes, high-dimensionality).
The partners involved in the project were: IMT Atlantique (PI: Ronan Fablet), Inria-Rennes, Inria-Grenoble, Laboratoire d'Océanographie Physique et Spatiale, Institut des géosciences et de l'environnement, Institut Pierre-Simon Laplace.
-
ANR SINEQ (2021-2025).
Participants: Mathias Rousset, Frédéric Cérou.
Simulating non-equilibrium stochastic dynamics. The goal of the SINEQ project is, from a mathematical perspective, to extend various variance reduction techniques used in the Monte Carlo computation of equilibrium properties of statistical physics models.
The partners involved in the project are: CERMICS (PI: G. Stoltz), CEREMADE and Inria Rennes.
10.2 Local initiatives
- Interdisciplinary challenge ("Défi interdisciplinaire") of Université de Rennes 1 (10k€): Characterization of the gut microbiota, from sedentary and inactive individuals to elite athletes: an innovative approach to predict the body's energy metabolism in response to exercise? With Frédéric Derbre (Laboratoire M2S / Université Rennes 2).
11 Dissemination
Participants: Frédéric Cérou, Théo Guyard, Patrick Héas, Cédric Herzet, Valérie Monbet, Mathias Rousset.
11.1 Promoting scientific activities
Participants: Frédéric Cérou, Théo Guyard, Patrick Héas, Cédric Herzet, Valérie Monbet, Mathias Rousset.
11.1.1 Scientific events: organisation
- Frédéric Cérou is organizing the weekly Probability Seminar at IRMAR, Univ Rennes.
- Valérie Monbet: co-organization of the workshop "Data Science pour les risques côtiers" (Data Science for coastal risks), Roscoff, 13-15 November 2023.
11.1.2 Journal
Reviewer - reviewing activities
Team members have reviewed for many journals, including top journals in applied mathematics, probability, and statistics.
11.1.3 Main talks
- Mathias Rousset, Monte Carlo Methods and Applications.
- Théo Guyard, PGMODAYS 2023, with contribution 6.
- Théo Guyard, SIAM Conference on Optimization, Seattle, USA.
- Cédric Herzet, JSTAR.
- Valérie Monbet, ECMI 2023, Poland.
11.1.4 Leadership within the scientific community
Valérie Monbet is:
- Co-director of the Agence Lebesgue since January 2023.
- Member of the AMIES board.
- Member of CNU section 26 until December 2023.
11.2 Teaching - Supervision - Juries
Participants: Patrick Héas, Cédric Herzet, Valérie Monbet, Mathias Rousset.
11.2.1 Teaching
Cédric Herzet has given:
- Signal processing ("Traitement du signal"), ENS Rennes, Master 1: module coordinator, 10h.
- Machine learning ("Apprentissage"), ENSAI Rennes, Master 2: 9h.
- Regularization ("Régularisation"), ENSAI, Master 2: module coordinator, 18h.
Mathias Rousset has given:
- A course at the summer school of the ANR SINEQ project.
- Preparation for the agrégation examination: modeling, probability and statistics option.
Patrick Héas has given:
- A course on algorithms and complexity ("Algorithmique et Complexité"), École supérieure d'ingénieurs de Rennes (ESIR), 2nd year, Université de Rennes 1.
11.2.2 Supervision
PhD Supervision and Defense
- M. Rousset has been supervising the PhD of Karim Tit, Rare event analysis of the reliability of deep neural networks, CIFRE Inria and Thalès, started January 2021; co-supervision: T. Furon.
- M. Rousset has been supervising the PhD of François Ernoult, Small noise analysis of rare event splitting algorithms, UR1 and Région Bretagne, started September 2020. The PhD was defended on December 15 (the HAL publication will be available next year).
- C. Herzet has been supervising the PhD of Théo Guyard, Screening methods for non-convex sparse representations, INSA, started September 2021; co-supervision: James Ledoux.
- C. Herzet has been supervising the PhD of Thu Le Tran, Sparse representations in continuous dictionaries with application to spectroscopy data, UR1, started September 2020; co-supervision: Valérie Monbet.
- V. Monbet has been supervising the CIFRE PhD of Victor Bertret with the company Purecontrol (Rennes), Machine learning and stochastic control for optimized automatic control of industrial systems, started September 2023.
Other
Valérie Monbet supervised the postdoctoral research of Pierre Houedry (funded by the Agence Lebesgue). Topic: machine learning and deep learning for the detection and classification of airborne pollen.
11.2.3 Juries
Valérie Monbet has been part of the following juries:
- HDR Pierre Tandeo (IMT atlantique, Brest)
- HDR Matthieu Marbac (ENSAI)
- HDR Audrey Lavenu (UR1)
- PhD Benjamin Dufée (INRIA)
Cédric Herzet has been part of the jury:
- PhD Gilles Monnoyer (UCLouvain, Belgique)
11.3 Popularization
Participants: Patrick Héas.
Patrick Héas gave a talk at the Frédéric Ozanam high school, in a 10th-grade ("seconde") class, presenting the work of a researcher in applied mathematics.
12 Scientific production
12.2 Publications of the year
International journals
International peer-reviewed conferences
Reports & preprints