Coastal areas are increasingly threatened by sea level rise induced by global warming. At the same time, 60% of the world's population lives within a 100 km-wide coastal strip (80% within 30 km of the shore in French Brittany).
Coastlines are therefore the focus of many issues of various kinds: economic, ecological, social, political, etc. Coastal areas are natural interfaces between various media (e.g. wind/sea/sand/land). The physical processes acting on these media have very different time scales, hence the need to build complex systems coupling nonlinear partial differential equations and random processes to describe them.
To address these crucial issues, LEMON is an interdisciplinary team working on the design, analysis and application of deterministic and stochastic models for inland and marine littoral processes, with an emphasis on both standalone models and hybrid systems.
The Montpellier area offers many opportunities:
The overall goal of the LEMON project-team is to develop mathematical and computational methods for modeling hydraulic and hydrodynamic processes. The mathematical tools used are deterministic (PDEs, ODEs) and/or probabilistic (extreme value theory). Applications range from regional oceanography to coastal management, including risk assessment for natural hazards on the coastline (submersion and urban floods, tsunamis, pollution).
LEMON is a joint research team between HSM (UM, CNRS, IRD), IMAG (UM, CNRS) and Inria, whose faculty members have never been associated with Inria groups in the past. All fellows share a strong background in mathematical modeling, together with a taste for applications to the littoral environment. As reflected in the team contributions, the research conducted by LEMON is interdisciplinary 3, thanks to the team members' expertise (deterministic and stochastic modeling, computational and experimental aspects) and to regular collaborations with scientists from other domains. We believe this is both a distinctive feature and a strength of LEMON.
Interdisciplinarity is a defining characteristic and a strength of LEMON. We want to build on this mix by developing two main research axes, physics-driven and data-driven models, applied to free-surface hydraulic processes and their coupling. These two axes intersect through the hybridization of models, and all this work will feed the development of the SW2D-LEMON software so that it remains both an operational, easy-to-use tool and a scientific reference of international standing.
Concerning the physics-driven modeling axis, we will continue to work with porosity models and, more generally, with upscaling mechanisms for free-surface hydraulics. We have known since 38 that every upscaled model is biased, which in turn distorts downscaling operations. We wish to better identify these biases and take them into account in order to improve both large-scale simulations (development of new models) and small-scale ones (downscaling using compensation techniques between large-scale models).
The collaboration with the University of California, Irvine (UCI) started in 2014 with research on the representation of urban anisotropic features in integral porosity models 41. It has led to the development of the Dual Integral Porosity model 39. Ongoing research focuses on improved representations of urban anisotropy in urban flood modeling.
Université Catholique de Louvain (UCL) is one of the few places with experimental facilities allowing for the systematic, detailed validation of porosity models. The collaboration with UCL started in 2005 and is still active.
In line with rapid changes in society at large, our scientific community is increasingly sensitive to the environmental footprint of research. Porosity models are already valuable for their frugality, thanks to coarse meshes and low-cost simulations. We also wish to develop a time discretization strategy that further lightens our algorithms. A first theoretical study has been carried out for 1D models; we wish to generalize it to 2D models and implement it in operational models.
Discussions have started with team TONUS in Strasbourg, as "CFL-less" methods are also used by the team for kinetic-relaxation approximation 40.
Improving the realism of flood scenarios also requires adding specific processes: we will continue modeling flow-building interactions (work initiated in Cécile Choley's PhD thesis) and develop the transport of log jams in urban flows, using the porosity concept to better account for the feedback of log jams on the flow (crowding process).
Finally, we wish to keep coupling the numerical models developed by the team with other processes: relying on collaborations outside LEMON (as is currently the case with the Inria SURF project for the Green-Naghdi / shallow water coupling) or on the recruitment of new permanent members, we will use the team's strengths in free-surface hydraulics and model coupling to explore new fields of application.
One original feature of LEMON is its data-driven component, which we wish to develop further. Data are indeed essential throughout the whole modeling/forecast process: providing source terms, bathymetric information, initial and boundary conditions; allowing model hybridization (using data assimilation or artificial intelligence methods); processing model outputs for risk measurements and decision making.
Macroscopic models such as those developed by the team have advantages in terms of computational cost, but also in terms of the time saved in mesh generation, since the mesh no longer needs to describe complex geometries. However, these models require new parameters reflecting the statistical properties of the domain geometry or of the topography/bathymetry of the modeled area. The directional and connectivity properties of the environment must be estimated from geographical data. Continuing Vita Ayoub's thesis 34, LEMON will work on methods for the automatic estimation of these parameters from cartographic or Earth observation data, as sketched below. The objective is a methodology for the continuous acquisition and automatic fusion of new information, so as to update the mapping of the study area as often as the hydrodynamic models require.
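As a simplified illustration of the kind of automatic estimation targeted here, the storage porosity of a coarse cell can be computed as the fraction of its plan area left free by buildings. The sketch below (Python with shapely; geometries and values are illustrative placeholders, not actual data) shows the principle:

```python
# Sketch: porosity of a coarse cell as the plan-area fraction
# not occupied by buildings, from a footprint layer.
from shapely.geometry import box
from shapely.ops import unary_union

cell = box(0, 0, 100, 100)                              # one coarse grid cell (m)
buildings = [box(10, 10, 30, 40), box(60, 50, 90, 80)]  # building footprints

built = unary_union(buildings).intersection(cell)
porosity = 1.0 - built.area / cell.area                 # fraction open to flow
print(f"cell porosity: {porosity:.2f}")
```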
Understanding the spatial and temporal variability of the rainfall events that can generate flash floods is a major challenge. This knowledge is essential to build stochastic methods for simulating scenarios that integrate realistic spatiotemporal extreme rainfall fields. This modeling must be done keeping in mind the importance of the physical interpretation of the data simulated with such models. We aim to develop, propose, study and implement models adapted to the presence of extreme values, taking into account the associated complex dependencies. One difficulty lies in modeling the transitions (in time and space) between no rain, regular rainfall and extreme rainfall. Reproducing spatial or temporal non-stationarity in the intensities as well as in the dependency structure is another challenge we wish to address.
In the medium term, we want to develop appropriate risk measures that can then be used to assess the potential impacts of extreme rainfall events. Multivariate risk measures should be considered, as flood risk indicators are usually derived by combining different hydraulic variables. We are interested in the estimation of risk sets: in the simplest framework, the idea is to identify all combinations of water depth and velocity that would lead to a risk above a fixed level (see the sketch below). More generally, the question of modeling dependence in statistics, in particular for extremes, is one to which we want to contribute, as is the consideration of compound events.
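As a simplified illustration of such a risk set, the sketch below identifies the (depth, velocity) pairs whose product, one common flood-hazard indicator, exceeds a fixed level; both the indicator and the threshold are placeholders, not the team's actual risk measure.

```python
# Sketch: a "risk set" in the (water depth, velocity) plane.
import numpy as np

h = np.linspace(0.0, 2.0, 201)   # water depth (m)
v = np.linspace(0.0, 3.0, 301)   # flow velocity (m/s)
H, V = np.meshgrid(h, v)

hazard = H * V                   # simple indicator combining the two variables
risk_set = hazard >= 0.5         # boolean mask of "dangerous" combinations

print(f"{risk_set.mean():.1%} of the grid exceeds the risk level")
```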
Finally, our aim is to model forcing terms (rainfall, wind, etc.) for a large number of stations and at a fine time scale. In addition, many covariates will be included in the models to better explain the phenomena. This means that we will deal with high-dimensional data and with potentially many parameters, which is a limitation both in terms of computation time and from a statistical point of view. We will therefore continue to propose methods to reduce the dimension: grouping stations for which rainfall behaves similarly (clustering, sketched below) and highlighting the few significant parameters that suffice to explain the model (sparsity).
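A minimal sketch of the clustering idea, with synthetic data and crude per-station summaries standing in for the extreme-value-specific features that would be used in practice:

```python
# Sketch: cluster stations whose rainfall behaves similarly,
# summarized here by a few simple statistics (synthetic data).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
rain = rng.gamma(shape=0.3, scale=5.0, size=(50, 10_000))  # 50 stations

features = np.column_stack([
    rain.mean(axis=1),
    rain.std(axis=1),
    np.quantile(rain, 0.99, axis=1),   # crude heavy-tail proxy
])

labels = KMeans(n_clusters=4, n_init=10).fit_predict(features)
```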
At the interface between these two main axes, we would like to continue working with hybrid models, in particular using artificial intelligence techniques. Since our recent publication 36, our team has been interested in physics-informed neural networks (PINNs) for fluid mechanics and participates in several working groups on this subject. Keeping in mind that we are not experts on this topic and that the competition is intense, we will explore, notably in Fadil Boodoo's PhD, the use of AI methods for the simulation of rainfall-flood systems (together with the rainfall-discharge and discharge-flood intermediate steps). We would also like to explore the transfer of knowledge from configurations for which data are numerous and of high quality (accurately known digital terrain model, good-quality instrumentation) to more rudimentary computational domains.
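For concreteness, here is a minimal toy sketch of the PINN principle (applied to 1D linear advection, not the team's rainfall-flood systems): a network is trained so that its output satisfies the PDE residual and the initial condition at sampled collocation points. All choices (architecture, equation, sampling) are illustrative.

```python
# Toy PINN for u_t + c * u_x = 0 on (x, t) in [0,1]^2, u(x,0) = sin(2*pi*x).
import torch
import torch.nn as nn

c = 1.0  # advection speed (illustrative)

net = nn.Sequential(
    nn.Linear(2, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 1),
)

def pde_residual(x, t):
    """Residual of u_t + c * u_x at points (x, t), via automatic differentiation."""
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    u = net(torch.cat([x, t], dim=1))
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    u_t = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    return u_t + c * u_x

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    x, t = torch.rand(256, 1), torch.rand(256, 1)        # collocation points
    loss_pde = pde_residual(x, t).pow(2).mean()          # physics term
    x0 = torch.rand(256, 1)                              # initial-condition term
    u0 = net(torch.cat([x0, torch.zeros_like(x0)], dim=1))
    loss_ic = (u0 - torch.sin(2 * torch.pi * x0)).pow(2).mean()
    loss = loss_pde + loss_ic
    opt.zero_grad(); loss.backward(); opt.step()
```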
To specify and carry out this work program, we hope that LEMON will be able to count on an Inria recruitment in the next 2 or 3 years (several candidates have already expressed interest in the 2023 competition). We will also benefit from data from the Water in the City observatory, structured around the HSM laboratory and led by members of LEMON. The SW2D-LEMON software will of course be at the core of the transfers operated by the team: we will continue to devote permanent-staff time to its development, while aiming to integrate this tool into a larger Inria platform in which engineering time (possibly shared with other teams) could be made available, enabling us to focus on our primary research missions.
The protection of coastal areas around the world has become an important concern, including within the scientific community. The coastline is defined as the physical separation between the sea or ocean on one side and the land on the other, but these two worlds are in fact intertwined, which contributes to the difficulty of modeling them, both from a physical and a statistical point of view.
Wave propagation models in the nearshore zone have evolved significantly over the last 15 years, with contributions that increasingly account for effects related to variations in bathymetry and hence the non-hydrostatic nature of the flow. These models, very specific to the coastal zone, must be able to be coupled, with each other and with external models, so that wave propagation models can be integrated into numerical forecasting platforms, both in oceanography and in flood risk management.
Due to climate change and rising sea levels, more and more cities are facing flood risk. Whether in coastal areas or near rivers, these cities, which are inherently highly artificial and therefore poorly resilient to rising water levels, require different types of numerical models for flood risk: accurate (and potentially costly) models for land-use planning, but also fast models that can be run in real time for crisis management.
Modeling and risk assessment are at the heart of environmental science. Whether the events considered are of natural or anthropogenic origin, their economic, ecological or human impacts are too important to be neglected. By definition, the more extreme an event is, the lower its frequency of occurrence and therefore the less data available to characterize it. Hence the importance of using statistical tools dedicated to the modeling of extreme events, in order to provide risk management tools that are as safe and effective as possible.
As for all Inria teams, the many calculations we perform (on our personal computers or on dedicated clusters) do have an environmental cost. This cost is linked both to the resources needed to manufacture the machines we use, and to the energy consumed to run them.
LEMON members are aware of the climate emergency and participate in actions on this subject. For example, Pascal Finaud-Guyot is involved in the "sustainable development and social responsibility" working group at Polytech Montpellier and in the "energy footprint reduction" working group at HSM with Carole Delenne. Several members of the team also participate in the local group of the Inria Montpellier Antenna dedicated to sustainable development and social responsibility.
Several LEMON members have committed to limiting their professional air travel to 10,000 km per year.
Our research activities have an indirect impact in terms of environmental responsibility:
2023 was an evaluation year for our project-team. This evaluation, coordinated by Inria's Evaluation Committee, is an important step in the team's life cycle. In particular, it enabled us to reflect collectively on the future of all teams involved in the environmental sciences theme. We have produced a summary of this foresight exercise, which raises more questions than it answers, but which reflects the topic's complexity and the debate that has opened up within the Institute itself.
TsunamiLab is an interactive tsunami simulation and visualization platform that teaches and raises awareness about tsunamis through interactive experiences. It allows science communicators, teachers, students and science enthusiasts to create virtual tsunamis or recreate historical tsunamis, and study their features in various digital and augmented reality formats.
TsunamiLab-Pool: Using cameras and projectors, the "pool" format allows children and adults to interact with their own hands, gathered around the circular screen. This allows the instructor to teach and engage several children simultaneously, in a way that is entertaining for all.
Web Platform: The platform's website allows anyone to simulate historical tsunamis, observe how they propagated in the ocean, and test what would have happened if they had been of greater or lesser magnitude.
Hologram: Through a prism, a holographic image makes it possible to observe the impact in different parts of the world at the same time.
Large Touch Screen: Support for large touch screens allows teachers to observe and explain phenomena in an engaging way in front of a group of students.
Cécile Choley's PhD 22 focuses on urban flood modeling and the representation of buildings. Flood management by public services relies on two-dimensional numerical models, which represent the exchanges between streets and buildings only partially, if at all. However, feedback and photos show that water enters homes, threatening people and their property. While some buildings act as reservoirs and temporarily or even permanently store part of the flood volume, others are crossed by the flow if they are connected to several streets. To characterize the effect of buildings on flooding, a new numerical model is proposed, based on an additional source term in the 2D shallow water equations. The street-building model is inspired by compartment models, in which street and building exchange a discharge through openings such as doors and windows. The transverse flow is controlled by discharge laws developed from three-dimensional simulations of real-scale openings. These exchange laws are based on the weir and orifice laws from the literature, with the discharge coefficient determined by minimizing the error on the numerically computed discharge; laws with a 30% error tolerance on the discharge through an opening are established. The street-building model is applied to a synthetic street.
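The sketch below illustrates the kind of exchange law involved, using the standard textbook weir and orifice formulas; the discharge coefficient and the switching logic are simplified placeholders rather than the calibrated laws established in the thesis.

```python
# Sketch: discharge through a rectangular opening (door/window)
# coupling a street cell and a building cell.
import math

G = 9.81  # gravity (m/s^2)

def exchange_discharge(h_street, h_building, width, z_sill, z_lintel, cd=0.6):
    """Flow rate (m^3/s), positive from street to building."""
    up, down = max(h_street, h_building), min(h_street, h_building)
    if up <= z_sill:
        return 0.0  # water below the sill: no exchange
    if up <= z_lintel:
        # free-surface flow through the opening: weir-type law
        head = up - z_sill
        q = cd * width * math.sqrt(2 * G) * head ** 1.5
    else:
        # opening fully submerged: orifice-type law
        area = width * (z_lintel - z_sill)
        q = cd * area * math.sqrt(2 * G * (up - down))
    return q if h_street >= h_building else -q
```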
When modeling large-scale urban floods, porosity-based nonlinear shallow water equations emerge as an interesting sub-grid approach for reducing computation time while preserving the structure of the solution. In such models, fine-scale topographic information is represented at a coarser scale through porosity parameters, speeding up computations at the expense of some accuracy in the hydrodynamic variables. In 28, we use the Single Porosity (SP) model in Cartesian coordinates to simulate flows in both an idealized and a real-world urban area, while gradually increasing the spatial resolution. During such partial coarsening, from fine scale to macro scale, the porosity distribution within the urban zone changes from a highly heterogeneous field to a more uniform one. At an intermediate meso-scale, where the cell size is of the order of the street width and the reduction in computation time is still significant, the main preferential flow paths within the urban area can be captured by means of the porosity gradient. At that scale, good agreement with refined classical model solutions is found for flow depth, flood extension and hazard index, both in magnitude and in spatial distribution. Numerical results highlight the importance of porosity models for quickly assessing flow properties during an event and improving real-time decision-making through reliable information.
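For reference, and up to notational variants, the Single Porosity model takes the following classical form in one space dimension, with $\phi$ the porosity, $h$ the water depth, $q = hu$ the unit discharge, and $S_0$, $S_f$ the bed and friction slopes (this is the standard textbook form, not necessarily the exact variant used in 28):

\[
\partial_t(\phi h) + \partial_x(\phi q) = 0,
\qquad
\partial_t(\phi q) + \partial_x\!\left(\phi\left(\frac{q^2}{h} + \frac{g h^2}{2}\right)\right)
= \frac{g h^2}{2}\,\partial_x\phi + \phi\, g h\,(S_0 - S_f).
\]

The porosity $\phi$ encodes, at the coarse scale, the plan-area fraction available to the flow, which is why its spatial gradient can reveal preferential flow paths.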
Following the work of 35 in
Flooding is the natural hazard that affects the most people worldwide, and climate change is likely to amplify this trend. Floods, whether in rivers or in urban areas, carry debris that can have a significant impact on hydrodynamics and therefore on risk. Although numerical models are frequently used to anticipate the impacts of floods, in order to improve land-use planning and facilitate crisis management, few models can accurately represent both the flood hydrodynamics and the associated debris transport process, and those that exist generally rely on simplifying assumptions. In 26, we propose a new operational model that is validated and compared with analytical results, other models and experimental data. The proposed model, implemented in the SW2D software, paves the way for a better representation of the effect of debris clogging on hydrodynamics, even with a large number of transported objects.
The calculation of wave shoaling and breaking is essential in coastal applications such as risk analysis. Due to the high cost of 3D models, 2D vertically averaged models are normally used. For propagation and shoaling, higher-order models are needed to describe wave dispersion with sufficient accuracy. For breaking, some dissipation mechanism is normally introduced; a simple switch from dispersive high-order models to the first-order Saint-Venant equations has been very popular for two reasons: the non-parametric energy dissipation of shock waves and its simple implementation. However, this model has been shown to become unstable as the resolution is increased. In 32 we show how domain decomposition methods offer a different way to "switch off" the dispersion, by exchanging information through boundary conditions and overlapping zones, with different performance trade-offs.
In 27, a large CFL algorithm is presented for the explicit, finite volume solution of hyperbolic systems of conservation laws. The Riemann problems used in the flux computation are determined using averaging kernels that extend over several computational cells. The usual Courant-Friedrichs-Lewy stability constraint is replaced with a constraint involving the kernel support size. This makes the method unconditionally stable with respect to the size of the computational cells, allowing the computational mesh to be refined locally to an arbitrary degree without altering solution stability. The practical implementation of the method is detailed for the shallow water equations with topographical source term. Computational examples report applications of the method to the linear advection, Burgers and shallow water equations. In the case of sharp bottom discontinuities, the need for improved, well-balanced discretizations of the geometric source term is acknowledged.
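Schematically, and with assumed notation ($L_K$ for the kernel support size, $\lambda_{\max}$ for the fastest wave speed), the change of stability constraint can be written as

\[
\Delta t \le \frac{\Delta x_{\min}}{\lambda_{\max}}
\quad\longrightarrow\quad
\Delta t \le \frac{L_K}{\lambda_{\max}},
\qquad
\lambda_{\max} = \max_i\left(|u_i| + \sqrt{g h_i}\right),
\]

so that refining the mesh locally (decreasing $\Delta x_{\min}$) no longer restricts the admissible time step, provided the kernel support is kept fixed.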
In the recent preprint 24, we address both novel modeling techniques and practical algorithmic applications. First, we introduce a new class of models called Asymptotic Independent Block (AI-block) models, which take a model-based approach to variable clustering: clusters are defined at the population level by the independence of the extremes across clusters. Second, we develop and rigorously evaluate an algorithm specifically tailored to AI-block models.
In addition, we situate our work in the context of multivariate stationary mixing random processes. To demonstrate the practical utility of the proposed AI-block models and the associated algorithm, we present two data analyses, one in neuroscience and one in environmental sciences. This work was also presented by Alexis Boulin three times in 2023 21, 31, 6.
Catastrophic climate events such as floods, forest fires and heat waves are often caused by the simultaneous extreme behavior of several interacting processes. Since in these compound events several spatio-temporal factors are jointly extreme and inherently high-dimensional, a proper understanding of them requires dependence summary measures suited to extreme-value random vectors. The latter are a key component for proposing a spatial clustering of these temporal processes. Based on the recent development of an algorithm specifically tailored to AI-block models (see also 24), we propose in the preprint 25 a clustering method adapted to compound extreme events. We illustrate this method with a regionalization task: we identify regions based on gridded data from observations and climate model ensembles over Europe, using daily precipitation sums and daily maximum wind speeds from the ERA5 reanalysis dataset from 1979 to 2022. This work was also presented at ICSDS 4, at the JDS 12 and at a workshop on data science for coastal risk in Roscoff in November.
The concept of sparse regular variation introduced in 42 makes it possible to infer the tail dependence of a random vector
Count data are omnipresent in many applied fields, often with overdispersion due to an excess of zeroes or of extreme values. Mixtures of Poisson distributions are an elegant and appealing modeling strategy in this context; in the published paper 2, we study how extreme value theory can be used within it. This work was also presented at an international conference by the PhD student Samuel Valiquette, see 20.
A flexible multivariate model for threshold exceedances is defined based on component-wise ratios between two independent random vectors with exponential and Gamma marginal distributions. This construction allows flexibility in terms of extremal bivariate dependence: asymptotic dependence and independence are both possible, as well as hybrid situations. Two useful parametric model classes are presented; one of them, based on Gamma convolution models, is illustrated through a simulation study. Good performance is shown for likelihood-based estimation of summaries of bivariate extremal dependence in several scenarios. This work was presented in an invited seminar at UCL and has been submitted 23.
A lot of work is currently being done within the ANR CROQUIS4 project and the European project STARWARS5 to collect data of different natures and formats on wastewater and stormwater networks. The issues raised by these kinds of heterogeneous, imprecise and uncertain geographical data were presented in 3 and at an autumn school (IA
Concerning water network representation, even though geographic information systems (GIS) are powerful tools for geographical data, each type of element (points, lines and polygons) is stored in a separate file, so that pipes and manhole covers or devices are disconnected and only linked through attributes in the associated database. One approach explored in Omar Et-Targuy's PhD is thus a graph-based representation that ensures connectivity and consistency 8, as sketched below.
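A minimal sketch of this graph-based idea (Python with networkx; field names are hypothetical): manholes become nodes and pipes become edges, so connectivity is explicit rather than hidden in database attributes, and consistency checks become graph queries.

```python
# Sketch: build a connectivity graph from GIS-like point and line records.
import networkx as nx

manholes = [  # e.g. parsed from a GIS point layer
    {"id": "M1", "x": 0.0, "y": 0.0},
    {"id": "M2", "x": 50.0, "y": 10.0},
]
pipes = [  # e.g. parsed from a GIS line layer
    {"id": "P1", "from": "M1", "to": "M2", "diameter_mm": 300},
]

g = nx.Graph()
for m in manholes:
    g.add_node(m["id"], pos=(m["x"], m["y"]))
for p in pipes:
    g.add_edge(p["from"], p["to"], id=p["id"], diameter_mm=p["diameter_mm"])

# Consistency checks become graph queries, e.g. listing isolated components:
print(list(nx.connected_components(g)))
```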
Omar Et-Targuy's PhD concerns the practical fusion and heterogeneous conditioning of uncertain data, with an application to urban water data. Conditioning is an important task for updating and revising uncertain information when new information, often considered reliable, arrives. His work deals with the so-called Fagin and Halpern (FH-)conditioning within the framework of possibility theory 37. In two conferences, Omar Et-Targuy presented his first results on the computation of FH-conditioning applied to weighted knowledge bases. He also compared FH-conditioning with the two standard forms of possibilistic conditioning (min-based conditioning and product-based conditioning) 138.
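For illustration, the toy sketch below contrasts the two standard possibilistic conditionings mentioned above on a small possibility distribution; FH-conditioning itself, being interval-based, is omitted for brevity, and the distribution is a made-up example.

```python
# Toy comparison of min-based and product-based possibilistic conditioning.
def pi_conditioning(pi, B, mode="min"):
    """Condition a possibility distribution pi (dict: state -> degree) on event B."""
    Pi_B = max(pi[w] for w in B)  # possibility degree of the evidence
    out = {}
    for w, p in pi.items():
        if w not in B:
            out[w] = 0.0
        elif mode == "min":       # min-based (qualitative) conditioning
            out[w] = 1.0 if p == Pi_B else p
        else:                     # product-based (quantitative) conditioning
            out[w] = p / Pi_B
    return out

pi = {"a": 1.0, "b": 0.7, "c": 0.4}
B = {"b", "c"}
print(pi_conditioning(pi, B, "min"))      # {'a': 0.0, 'b': 1.0, 'c': 0.4}
print(pi_conditioning(pi, B, "product"))  # {'a': 0.0, 'b': 1.0, 'c': 0.571...}
```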
During K. Bakong's internship, supported by IRT Saint-Exupery, we considered downscaling algorithms for shallow water flows based on artificial intelligence techniques. Neural networks and boosted trees are used to simulate high-resolution flow variables from low-resolution inputs. Various numerical configurations are addressed, with or without principal component analysis to reduce the computational cost of the training and forecasting steps.
This work has been published in 36.
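A minimal sketch of this strategy, with synthetic data standing in for the actual flow fields: PCA compresses the high-resolution output space and a boosted-tree model maps low-resolution inputs to the retained components (the model choices here are illustrative, not those of the published study).

```python
# Sketch: PCA-reduced downscaling with boosted trees (synthetic data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.default_rng(0)
X_lr = rng.normal(size=(500, 25))          # low-resolution inputs (placeholder)
Y_hr = X_lr @ rng.normal(size=(25, 400))   # high-resolution fields (placeholder)

pca = PCA(n_components=10).fit(Y_hr)       # compress the HR output space
Z = pca.transform(Y_hr)

model = MultiOutputRegressor(GradientBoostingRegressor()).fit(X_lr, Z)

# Forecast: predict the components, then reconstruct the high-resolution field.
Y_pred = pca.inverse_transform(model.predict(X_lr[:5]))
```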
The aim of Fadil Boodoo's PhD is to take advantage of new AI approaches to rapidly generate flood extent maps for flood forecasting. In the first part of the PhD, we focused on rainfall-runoff models, through a comparison of classical hydrological models with Long Short-Term Memory (LSTM) models 19, 29, 30 (a minimal sketch follows).
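A minimal PyTorch sketch of the LSTM rainfall-runoff setup (toy dimensions and synthetic data; the actual benchmark uses real catchment forcings and discharge records):

```python
# Toy LSTM mapping a window of daily forcings to discharge.
import torch
import torch.nn as nn

class RainfallRunoffLSTM(nn.Module):
    def __init__(self, n_forcings=3, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_forcings, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):             # x: (batch, time, n_forcings)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])  # discharge at the end of the window

model = RainfallRunoffLSTM()
x = torch.randn(8, 365, 3)            # 8 samples, one year of daily forcings
q = torch.randn(8, 1)                 # observed discharge (synthetic)
loss = nn.functional.mse_loss(model(x), q)
loss.backward()
```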
The next step will be to generate flood extent maps with another kind of neural network (such as graph neural networks), trained on physics-based flood simulations produced with the SW2D-DDP model.
The overall objective of the research program is to develop a numerical tool able to represent, in urban areas, flood propagation and transport (sediment, debris and vehicles), as well as the potential feedback from transport to the flow. Several directions have been identified:
Carole Delenne is a member of the STARWARS European project steering committee: STormwAteR and WastewAteR networkS heterogeneous data AI-driven management (MSCA Staff Exchange program, Grant Number 101086252).
Public and private stakeholders of the wastewater and stormwater sectors are increasingly faced with large quantities and multiple sources of information/data of different nature: databases of factual data, geographical data, various types of images, digital and analogue maps, intervention reports, incomplete and imprecise data (on locations and the geometric features of networks), evolving and conflicting data (from different eras and sources), etc. The main objective of this multidisciplinary project is to provide novel proposals for the management of heterogeneous data with an application to stormwater and wastewater networks. The STARWARS project aims to bring together researchers from the AI and Water Sciences communities in order to enhance the emergence of new practical solutions for representing, managing, modeling, merging, completing, reasoning, explaining and query answering over data of different forms pertaining to stormwater and wastewater networks. The project is implemented through five work packages (WP):
The scheduled secondment plan is designed with the aim of maximizing knowledge transfer and training between the two fields of Water Sciences and AI and thus facilitating the achievement of the project objectives.
Carole Delenne was a reviewer for the ENIGMA workshop ("AI-driven heterogeneous data management: completing, merging, handling inconsistencies and query-answering"), held alongside the KR conference in Rhodes, Greece, September 3-4, 2023.
Five UM-affiliated members of LEMON are academics, with a total teaching load of approximately 1000 hrs/year. These members also undertook significant administrative duties (approx. 1000 hrs) in 2023: