One way to bridge the scale between fully atomistic models and more coarse-grained descriptions is to use Markov state models parameterized by the Eyring–Kramers formulas. These formulas give the hopping rates between local minima of the potential energy function, and they require identifying the local minima and saddle points of that function. This approach is for example used in...
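For concreteness, in the simplest one-dimensional overdamped regime the Eyring–Kramers formula gives the hopping rate from a local minimum $x_0$ of the potential $V$ over a saddle point $z$ as

```latex
k \;\approx\; \frac{\sqrt{V''(x_0)\,\lvert V''(z)\rvert}}{2\pi}\; e^{-\beta\,(V(z)-V(x_0))},
```

where $\beta$ is the inverse temperature; the multidimensional version replaces the curvatures by determinants of the Hessians at the minimum and at the saddle.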

I will discuss two methods to coarse-grain and predict atomic kinetics generated by molecular dynamics, with application to diffusion and plasticity in metals. When the energy landscape is metastable, atomic kinetics can be mapped to a discrete Markov chain with robust Bayesian bounds on unseen transitions. These bounds are used to allocate resources in massively parallel computation and...

As a meta-algorithm, population annealing can be combined with a wide range of simulation methods, including Monte Carlo and molecular dynamics. In the past, we have analyzed the approach with regard to the scaling of statistical and systematic errors, proposed improvements, and implemented the method on highly efficient graphics processing units. In the present talk I will discuss recent...
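As a minimal illustration (a toy sketch on a one-dimensional double well, not the GPU implementation discussed in the talk), population annealing alternates resampling of the replica population with Metropolis equilibration as the inverse temperature is increased:

```python
import numpy as np

rng = np.random.default_rng(0)

def V(x):  # illustrative double-well potential with minima at x = +/-1
    return (x**2 - 1.0)**2

def population_annealing(n_pop=500, betas=np.linspace(0.1, 5.0, 30),
                         n_sweeps=10, step=0.5):
    x = rng.uniform(-2, 2, size=n_pop)          # population at high temperature
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # resample replicas with weights exp(-(b - b_prev) * E) to stay equilibrated
        w = np.exp(-(b - b_prev) * V(x))
        w /= w.sum()
        x = rng.choice(x, size=n_pop, p=w)
        # Metropolis sweeps at the new inverse temperature b
        for _ in range(n_sweeps):
            prop = x + step * rng.standard_normal(n_pop)
            acc = rng.random(n_pop) < np.exp(-b * (V(prop) - V(x)))
            x = np.where(acc, prop, x)
    return x

samples = population_annealing()   # low-temperature samples concentrate near +/-1
```

The resampling step is what distinguishes the method from plain simulated annealing: it keeps the population close to equilibrium along the whole temperature schedule.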

In this talk, I will introduce sampling issues in glassy disordered systems, particularly glass-forming liquids, which constitute a long-standing problem in condensed matter physics. I will explain why this is important and difficult, and I will review various previous attempts.

In probability theory, the notion of "weak convergence" is often used to compare probability distributions: two distributions are regarded as equivalent in this sense when the average values of well-behaved functions agree under both. In coarse-grained modeling, Noid and Voth developed a thermodynamic equivalence principle with a similar requirement. Nevertheless,...
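Concretely, two probability measures $\mu$ and $\nu$ are identified in this sense when

```latex
\int f \,\mathrm{d}\mu \;=\; \int f \,\mathrm{d}\nu
\qquad \text{for every bounded continuous observable } f.
```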

Energy-based models (EBMs) are powerful generative machine learning models that are able to encode the complex distribution of a dataset in the Gibbs-Boltzmann distribution of a model energy function. This means that, if properly trained, they can be used to synthesize new patterns that resemble those of the data set as closely as possible, but also that this energy function can be used to...

Deep generative models parametrize very flexible families of distributions able to fit complicated datasets of images or text. These models provide independent samples from complex high-dimensional distributions at negligible cost. On the other hand, sampling exactly from a target distribution, such as a Bayesian posterior, is typically challenging: either because of dimensionality, multi-modality,...

The Koopman Operator presents a powerful framework for dimensionality reduction of (stochastic) dynamical systems. In addition, metastable sets and their rates of transition can be obtained by analysing its spectrum. In this talk, we first recap the basics of Koopman methods, and then move on to discuss recent advances and current challenges.
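As a deliberately simple illustration of the data-driven side, Extended Dynamic Mode Decomposition (EDMD) estimates a finite-dimensional approximation of the Koopman operator by least squares on snapshot pairs; on a linear system with a linear dictionary (an illustrative choice) it recovers the system's eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative data: a linear stochastic system x_{t+1} = A x_t + noise, whose
# Koopman eigenvalues on linear observables are the eigenvalues of A.
A = np.array([[0.9, 0.0], [0.0, 0.5]])
X = np.zeros((2000, 2))
for t in range(1999):
    X[t + 1] = A @ X[t] + 0.1 * rng.standard_normal(2)

def edmd(snapshots, dictionary):
    """EDMD: least-squares estimate of the Koopman matrix on a dictionary."""
    Psi_x = dictionary(snapshots[:-1])
    Psi_y = dictionary(snapshots[1:])
    K, *_ = np.linalg.lstsq(Psi_x, Psi_y, rcond=None)
    return K

K = edmd(X, lambda x: x)                     # linear dictionary psi(x) = x
eigs = np.sort(np.linalg.eigvals(K).real)[::-1]   # close to 0.9 and 0.5
```

Implied relaxation timescales can then be read off as $-\tau/\ln\lambda_i$ for lag time $\tau$, which is how metastable sets and transition rates are extracted from the spectrum.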

The study of chemical reactions involving the formation and breaking of covalent bonds inherently presents stochastic complexities. To overcome the challenges posed by large energy barriers, researchers commonly use enhanced sampling techniques like transition path sampling and umbrella sampling to collect relevant data at the DFT level. However, the data acquired through these processes requires careful...

(joint work with Luca Nenna)

In this talk, we will present recent mathematical results about the Lieb functional in Density Functional Theory. More precisely, the Lieb functional, for a given electronic density, can be viewed as a generalized form of an optimal transport problem for which the electronic density plays the role of a marginal. A numerical discretization of this problem can be...

This talk introduces Barrier Hamiltonian Monte Carlo (BHMC), a version of HMC which aims at sampling from a Gibbs distribution π on a manifold M, endowed with a Hessian metric g derived from a self-concordant barrier. Like Riemannian manifold HMC, our method relies on Hamiltonian dynamics which incorporate g. It takes into account the constraints defining M and is therefore able to exploit its...
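For orientation, in Riemannian-manifold HMC of the Girolami–Calderhead type the Hamiltonian associated with a target $\pi \propto e^{-U}$ and metric $g$ typically reads

```latex
H(q,p) \;=\; U(q) \;+\; \tfrac{1}{2}\log\det g(q) \;+\; \tfrac{1}{2}\, p^{\top} g(q)^{-1} p,
```

so that integrating $e^{-H}$ over the momentum $p$ recovers the target $\pi$ as the marginal in $q$.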

A significant challenge faced by atomistic simulations is the difficulty, and often impossibility, of sampling the transitions between metastable states of the free-energy landscape associated with slow molecular processes. Importance-sampling schemes represent an appealing option to accelerate the underlying dynamics by smoothing out the relevant free-energy barriers, but require the definition...

I will discuss recent work on unifying flow-based and diffusion-based methods through a generative modeling paradigm we call stochastic interpolants. These models enable the use of a broad class of continuous-time stochastic processes, called `stochastic interpolants', to bridge any two arbitrary probability density functions exactly in finite time. These interpolants are built by combining...
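In one common form (following Albergo, Boffi and Vanden-Eijnden), a stochastic interpolant between densities $\rho_0$ and $\rho_1$ can be written as

```latex
x_t \;=\; \alpha(t)\, x_0 \;+\; \beta(t)\, x_1 \;+\; \gamma(t)\, z,
\qquad x_0 \sim \rho_0,\quad x_1 \sim \rho_1,\quad z \sim \mathcal{N}(0, I),
```

with boundary conditions $\alpha(0)=\beta(1)=1$, $\alpha(1)=\beta(0)=0$ and $\gamma(0)=\gamma(1)=0$, so that $x_t$ interpolates exactly between the two densities on $t \in [0,1]$.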

Breaking reversibility in Monte Carlo algorithms often leads to substantial accelerations in sampling complex systems. Event-Chain Monte Carlo (ECMC) has made it possible to investigate the two-dimensional hard-sphere phase transition, building on non-reversible continuous translational moves. However, more general systems require rotations of some sort to thermalize.

In this work, we build on the...

Sampling the Boltzmann distribution using forces that violate detailed balance can be faster than with the equilibrium evolution, but the acceleration depends on the nature of the nonequilibrium drive and the physical situation. Here, we study the efficiency of forces transverse to energy gradients in dense liquids through a combination of techniques: Brownian dynamics simulations, exact...
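A minimal sketch of such a transverse perturbation (illustrative Gaussian target and drive, not the dense liquids studied in the talk): adding a constant antisymmetric matrix A to the gradient drift breaks detailed balance while leaving the Boltzmann distribution invariant, as the simulated variances confirm:

```python
import numpy as np

rng = np.random.default_rng(2)

# Gaussian target U(x) = |x|^2 / 2 in 2D, unit stationary variance per coordinate.
# The transverse part -A grad U with antisymmetric A leaves exp(-U) stationary.
A = np.array([[0.0, 2.0], [-2.0, 0.0]])   # strength of the nonequilibrium drive
I = np.eye(2)

def grad_U(x):
    return x

dt, n_steps = 1e-3, 400_000
x = np.zeros(2)
traj = np.empty((n_steps, 2))
for t in range(n_steps):
    drift = -(I + A) @ grad_U(x)          # equilibrium gradient + transverse force
    x = x + dt * drift + np.sqrt(2 * dt) * rng.standard_normal(2)
    traj[t] = x

var = traj[n_steps // 10:].var(axis=0)    # close to 1 in each coordinate
```

Invariance holds because the extra probability flux $A\nabla U\, e^{-U}$ is divergence-free for antisymmetric A; the drive only stirs the dynamics within level sets, which is the mechanism behind the acceleration.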

Overdamped Langevin dynamics are stochastic differential equations, where gradient dynamics are perturbed by noise in order to sample high dimensional probability measures such as the ones appearing in computational statistical physics and Bayesian inference. By varying the diffusion coefficient, there are in fact infinitely many overdamped Langevin dynamics which preserve the target...
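Schematically, for a smooth diffusion matrix $D$ this family of dynamics (written here with $\beta = 1$) is

```latex
\mathrm{d}X_t \;=\; \bigl(-D(X_t)\,\nabla V(X_t) + \operatorname{div} D(X_t)\bigr)\,\mathrm{d}t
\;+\; \sqrt{2\,D(X_t)}\;\mathrm{d}W_t,
```

each member of which leaves the target measure $\propto e^{-V}$ invariant; the divergence term compensates for the position dependence of $D$.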

Modifying or biasing the dynamics of atomistic systems can result in faster mixing and convergence of thermodynamic observables, but it generally yields non-physical kinetics. I will introduce a family of so-called "Accelerated Molecular Dynamics" methods that are specifically designed to produce statistically accurate "state-to-state" dynamics for metastable systems at a much reduced...

Coupling from the past is a method for obtaining perfect samples from Markov chain Monte Carlo algorithms. The price paid is that the running time becomes random. We will present some recent results concerning the limit behaviour of this random time, and discuss a number of open conjectures.
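To make the construction concrete, here is a toy implementation (an illustrative reflecting random walk, not an example from the talk) of coupling from the past, using the standard monotone sandwiching between the minimal and maximal states:

```python
import random

random.seed(0)
N = 4   # state space {0, ..., N}; reflecting walk with uniform stationary law

def step(x, u):
    """Monotone update: every chain is driven by the same uniform u."""
    return min(x + 1, N) if u < 0.5 else max(x - 1, 0)

def cftp():
    """Coupling from the past: extend the random input further into the past
    (reusing the randomness already drawn for more recent times) until the
    maximal and minimal chains have coalesced by time 0."""
    us, T = [], 1
    while True:
        us = [random.random() for _ in range(T - len(us))] + us  # prepend new noise
        lo, hi = 0, N
        for u in us:
            lo, hi = step(lo, u), step(hi, u)
        if lo == hi:
            return lo        # a perfect sample from the stationary distribution
        T *= 2

samples = [cftp() for _ in range(2000)]
```

The running time is the random coalescence horizon T mentioned in the abstract: each call doubles the look-back until the two extreme chains meet, and the returned state at time 0 is exactly stationary.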

How to parallelize computation and how to diagnose convergence remain largely open questions regarding MCMC. Since Glynn & Rhee (Journal of Applied Probability, Vol. 51A, 2014), various advances based on couplings of MCMC algorithms have been proposed. The key is the design of coupled chains that, if properly constructed, can be employed to build estimators that do not suffer from the...
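As a sketch of the underlying construction (in the form popularized by Jacob, O'Leary and Atchadé): given coupled chains $(X_t)$ and $(Y_t)$ with the same marginal law, with $Y$ lagged by one step and the two chains meeting exactly at a random time $\tau$, the quantity

```latex
H_k \;=\; h(X_k) \;+\; \sum_{t=k+1}^{\tau-1} \bigl( h(X_t) - h(Y_{t-1}) \bigr)
```

is an unbiased estimator of $\pi(h)$, so averages over independent replicas can be computed in parallel without asymptotic bias.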

Rapid cooling or heating of a physical system can lead to unusual thermal relaxation phenomena. A prime example of anomalous thermal relaxation is the Mpemba effect. The phenomenon occurs when a system prepared at a hot temperature overtakes an identical system prepared at a warm temperature and equilibrates faster to the cold environment. A similar effect exists in heating. Comparing two...

Rare events are of primary importance for understanding the impact of climate change. The first class consists of extreme events with devastating impacts; the second of rare trajectories which lead to bifurcations, drastic changes of the climate system's configuration, and tipping points. However, because these events are so rare and realistic models so complex, they cannot be computed...

Constraint satisfaction problems (CSPs) deal with finding an assignment of a set of variables that satisfies a set of constraints. In the last decade, it has been found that many CSPs exhibit different levels of computational hardness as the number of constraints is varied. The same issue arises in inference problems in the so-called planted setting, where a planted configuration always exists...

The elastic properties of tungsten, a ubiquitous material in future energy systems, are investigated up to its melting temperature by means of a data-driven approach. The proposed workflow combines machine learning of the force field and enhanced sampling of the crystalline structure. While the machine learning force field achieves the accuracy of ab initio calculations, its implementation in...

A goal of unsupervised machine learning is to build representations of complex high-dimensional data, with simple relations to their properties. Such disentangled representations make it easier to interpret the significant latent factors of variation in the data, as well as to generate new data with desirable features. The methods for disentangling representations often rely on an adversarial...

Generative models aim to learn the empirical distribution of a given data set in order to build a probabilistic model capable of generating new samples that are statistically similar to the data set. Ideally, one also obtains an approximately tractable analytical description of this distribution.

In this presentation, I will specifically consider the case of the so-called...

Reconstructing, or generating, high dimensional probability distributions starting from data is a central problem in machine learning and data sciences.

We will present a method, the Wavelet Conditional Renormalization Group, that combines ideas from physics (renormalization group theory) and computer science (wavelets, Monte Carlo sampling, etc.). The Wavelet Conditional Renormalization...

Normalizing Flows (NF) are generative models which transform a simple prior distribution into the desired target. However, they require the design of an invertible mapping whose Jacobian determinant has to be computable. The recently introduced Neural Hamiltonian Flows (NHF) are Hamiltonian dynamics-based flows, which are continuous, volume-preserving and invertible, and thus make for natural...
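The computability requirement comes from the change-of-variables formula: for a flow $x = f(z)$ pushing a prior $p_Z$ forward to the model density $p_X$,

```latex
\log p_X(x) \;=\; \log p_Z\!\bigl(f^{-1}(x)\bigr) \;+\; \log \bigl\lvert \det J_{f^{-1}}(x) \bigr\rvert,
```

so maximum-likelihood training needs both the inverse map and its Jacobian determinant; volume-preserving flows such as NHF make the determinant term trivial.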

In many systems, exceptional events can have a crucial impact, while routine behavior is uneventful and without consequences. Well-known examples are earthquakes in the Earth's lithosphere or extreme weather events. Predicting their magnitude or their rate of occurrence is a major challenge for human security and the economy. Large deviation theory is the branch of probability theory that addresses...
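Schematically, a large deviation principle states that the probability of an empirical observable $A_N$ taking an atypical value $a$ decays exponentially,

```latex
\mathbb{P}(A_N \approx a) \;\asymp\; e^{-N\, I(a)},
```

where the rate function $I$ quantifies how unlikely each fluctuation is and vanishes at the typical value.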

Run-and-tumble particles are a paradigmatic model in out-of-equilibrium physics that exhibits interesting phenomena not found in their passive counterparts, such as motility-induced phase separation. I will present the long-time behavior of a pair of such particles with hard-core interactions on a one-dimensional torus and on a line by casting them as a piecewise deterministic Markov process. I...
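A minimal event-driven sketch of this kind of PDMP (a single free particle with illustrative parameters, without the hard-core interactions of the talk): between exponentially distributed tumble events the motion is purely deterministic, and the position on the torus equilibrates to the uniform law:

```python
import random

random.seed(3)

# A free run-and-tumble particle on the torus [0, L): ballistic flights
# interrupted by Poissonian tumbles that redraw the direction.
L, v, gamma = 1.0, 1.0, 2.0     # torus length, speed, tumbling rate
x, sigma = 0.0, 1               # position and direction (+1 or -1)
t, t_end = 0.0, 1000.0
positions = []

while t < t_end:
    tau = random.expovariate(gamma)       # waiting time to the next tumble
    tau = min(tau, t_end - t)
    x = (x + sigma * v * tau) % L         # deterministic flight between events
    positions.append(x)
    if random.random() < 0.5:             # tumble: draw a fresh direction
        sigma = -sigma
    t += tau

mean_pos = sum(positions) / len(positions)   # close to L / 2 by symmetry
```

Event-driven simulation is exact for PDMPs: no time-step discretization error enters, since the flow between jump times is integrated in closed form.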

Piecewise deterministic Markov processes (PDMPs) received substantial interest in recent years as an alternative to classical Markov chain Monte Carlo algorithms. While theoretical properties of PDMPs have been studied extensively, their practical implementation remains limited to specific applications in which bounds on the gradient of the negative log-target can be derived. In order to...