Shared technologies for all codes

Codes presentation

DassFlow is the code simulating river dynamics, either based on the 2D Shallow Water system (2D SW) in the DassFlow2D code, or on the 1D Saint-Venant system (1D SW) in the DassFlow1D code.
SMASH is the code simulating hydrology at watershed scale with a spatially distributed model.
DassHydro is the code integrating both, which are weakly coupled in a one-way chain from SMASH to DassFlow2D.
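
Such a weak (one-way) coupling can be sketched as below. The function names and model equations are purely illustrative stand-ins, not the actual SMASH/DassFlow APIs; the point is only that the hydrological model's simulated discharge becomes the hydraulic model's inflow boundary condition:

```python
import numpy as np

# Illustrative stand-ins for the two models (NOT the real SMASH/DassFlow APIs).

def run_hydrology(rainfall):
    """Toy rainfall-runoff model: a linear reservoir producing a discharge series."""
    q, state = [], 0.0
    for p in rainfall:
        state = 0.9 * state + p        # simple storage update
        q.append(0.1 * state)          # outflow proportional to storage
    return np.array(q)

def run_hydraulics(inflow):
    """Toy routing placeholder for the hydraulic solver: lag and attenuate the inflow."""
    return np.convolve(inflow, [0.25, 0.5, 0.25])[: len(inflow)]

rain = np.array([0.0, 5.0, 10.0, 2.0, 0.0, 0.0, 0.0, 0.0])
q_upstream = run_hydrology(rain)       # SMASH-like step: catchment -> discharge
q_routed = run_hydraulics(q_upstream)  # DassFlow2D-like step: discharge as inflow BC
print(q_routed.round(3))
```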

Codes languages and architecture

  • The three codes ( DassFlow2D , DassFlow1D and SMASH ) have been built following the same design principles and technologies.
    The computational kernels are written in Fortran 90, with parallel capabilities for DassFlow2D and SMASH (a feature DassFlow1D does not need).

  • The Fortran kernel codes are wrapped in Python using f90wrap [Kermode, 2020].
    This makes numerous Python libraries available on top of the physics-based computations, for pre- and post-processing of the computed fields or data, as well as Machine Learning, wavelet bases, etc.
    UQ libraries such as OpenTURNS can easily be employed jointly too.
    Linear systems are solved using the MUMPS or NumPy libraries.

  • Each code version includes a forward code and the corresponding adjoint code, generated with the automatic differentiation tool Tapenade [Hascoet and Pascual, 2013].

  • Each code version includes various optimization algorithms and estimation approaches (e.g. Bayesian analysis).
    Typical optimization algorithms include (non-exhaustive list):

    • Limited-memory Broyden-Fletcher-Goldfarb-Shanno Bounded (L-BFGS-B): a quasi-Newton method for bound-constrained optimization [Zhu et al., 1994].
    • Adaptive Moment Estimation (ADAM): adaptive learning rates with momentum for fast convergence [Kingma and Ba, 2014].
  • The complete optimization process provides parameter identification, reduced uncertainties, and calibrated model(s).

  • Variational Data Assimilation (VDA) formulations are based on priors: optimization starting points and covariance operators (probabilistic priors).

  • Both classical and original covariance operators can be considered.

  • Different regularization terms can be considered.

  • The estimation strategies can partially rely on Machine Learning techniques, e.g. feed-forward neural networks or LSTMs (via the PyTorch library).

  • Each code is interfaced with a few pre- and post-processors enabling large-scale, real-world datasets to be handled: GIS libraries, mesh generators, visualization tools.
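
As a minimal illustration of the bound-constrained quasi-Newton step used in such calibration loops, the sketch below fits two parameters of a toy model with SciPy's L-BFGS-B, supplying the analytical gradient the way an adjoint code would. The toy model and synthetic observations are purely illustrative, not part of the DassHydro codes:

```python
import numpy as np
from scipy.optimize import minimize

# Toy "forward model": discharge q(t) = a * exp(-b * t); synthetic observations
# are generated with the true parameters a = 2.0, b = 0.5.
t = np.linspace(0.0, 5.0, 50)
q_obs = 2.0 * np.exp(-0.5 * t)

def cost_and_grad(theta):
    """Quadratic misfit J(theta) and its analytical gradient (adjoint-like)."""
    a, b = theta
    q = a * np.exp(-b * t)
    r = q - q_obs                          # misfit to observations
    j = 0.5 * np.sum(r**2)                 # cost function J(theta)
    dj_da = np.sum(r * np.exp(-b * t))     # dJ/da
    dj_db = np.sum(r * (-a * t) * np.exp(-b * t))  # dJ/db
    return j, np.array([dj_da, dj_db])

# Bound-constrained quasi-Newton optimization; jac=True means cost_and_grad
# returns (cost, gradient) in one call, as a forward + adjoint run would.
res = minimize(cost_and_grad, x0=[1.0, 1.0], method="L-BFGS-B",
               jac=True, bounds=[(0.1, 10.0), (0.01, 5.0)])
print(res.x)  # close to the true parameters (2.0, 0.5)
```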

Keywords: Python-wrapped computational codes; Fortran computational kernels (using MPI for parallelization); adjoint codes; optimization libraries; pre- and post-processing Python interfaces; mesh procedures for model coupling.