Abstract:
Improving Importance Sampling estimators for rare event probabilities requires sharp approximations of the optimal density leading to a nearly zero-variance estimator. This paper presents a new way to handle the estimation of the probability of a rare event defined as a finite intersection of subsets. We provide a sharp approximation of the density of long runs of a random walk conditioned by multiple constraints, each of them defined by an average of a function of its summands, as their number tends to infinity.

Abstract:
Exploiting stochastic path integral theory, we obtain \emph{by simulation} substantial gains in efficiency for the computation of reaction rates in one-dimensional, bistable, overdamped stochastic systems. Using a well-defined measure of efficiency, we compare implementations of ``Dynamic Importance Sampling'' (DIMS) methods to unbiased simulation. The best DIMS algorithms are shown to increase efficiency by factors of approximately 20 for a $5 k_B T$ barrier height and 300 for $9 k_B T$, compared to unbiased simulation. The gains result from close emulation of natural (unbiased), instanton-like crossing events with artificially decreased waiting times between events that are corrected for in rate calculations. The artificial crossing events are generated using the closed-form solution to the most probable crossing event described by the Onsager-Machlup action. While the best biasing methods require the second derivative of the potential (resulting from the ``Jacobian'' term in the action, which is discussed at length), algorithms employing solely the first derivative do nearly as well. We discuss the importance of one-dimensional models to larger systems, and suggest extensions to higher-dimensional systems.
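For context, the Onsager-Machlup action mentioned above, for an overdamped process $\dot{x} = -V'(x) + \sqrt{2D}\,\xi(t)$, is commonly written as follows; the prefactor and sign of the Jacobian term depend on the time-discretization convention, so treat this as a generic sketch rather than the paper's exact normalization:

$$S[x] \;=\; \int_0^T \left[\frac{\bigl(\dot{x}(t) + V'(x(t))\bigr)^2}{4D} \;+\; \frac{1}{2}\,V''(x(t))\right] dt .$$

The $V''$ contribution is the "Jacobian" term referred to above, which is why the best biasing methods require the second derivative of the potential.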

Abstract:
Rare event simulation and estimation for systems in equilibrium are among the most challenging topics in molecular dynamics. As was shown by Jarzynski and others, nonequilibrium forcing can theoretically be used to obtain equilibrium rare event statistics. The advantage seems to be that the external force can speed up the sampling of the rare events by biasing the equilibrium distribution towards a distribution under which the rare events are no longer rare. Yet algorithmic methods based on Jarzynski's and related results often fail to be efficient because they are based on sampling in path space. We present a new method that replaces the path sampling problem by minimization of a cross-entropy-like functional, which boils down to finding the optimal nonequilibrium forcing. We show how to solve the related optimization problem in an efficient way by using an iterative strategy based on milestoning.
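To illustrate the cross-entropy idea in its simplest setting (not the milestoning-based path-space algorithm of the abstract), the sketch below learns the mean of a Gaussian importance density for the toy problem $p = P(X > \gamma)$, $X \sim N(0,1)$. The function name, the Gaussian family, and the elite-fraction update are standard textbook choices, not taken from the paper.

```python
import math
import random

# Toy cross-entropy (CE) method: learn the mean mu of a tilted density
# N(mu, 1) for estimating p = P(X > gamma) under X ~ N(0, 1).
# For a Gaussian family the CE minimization reduces to a likelihood-ratio
# weighted mean of "elite" samples above an adaptive level.

def cross_entropy_estimate(gamma=4.0, n=10000, iters=5, elite_frac=0.1, seed=1):
    rng = random.Random(seed)
    mu = 0.0
    for _ in range(iters):
        xs = sorted((rng.gauss(mu, 1.0) for _ in range(n)), reverse=True)
        # adaptive level: the (1 - elite_frac) quantile, capped at gamma
        level = min(gamma, xs[int(elite_frac * n)])
        elite = [x for x in xs if x >= level]
        # likelihood ratio N(0,1) vs N(mu,1): w(x) = exp(-mu*x + mu^2/2)
        ws = [math.exp(-mu * x + mu * mu / 2) for x in elite]
        # CE update for a Gaussian family: weighted mean of the elite samples
        mu = sum(w * x for w, x in zip(ws, elite)) / sum(ws)
    # final importance-sampling estimate under the learned tilted density
    xs = [rng.gauss(mu, 1.0) for _ in range(n)]
    est = sum(math.exp(-mu * x + mu * mu / 2) for x in xs if x > gamma) / n
    return est, mu
```

For $\gamma = 4$ the exact value is $\Phi(-4) \approx 3.2\times 10^{-5}$, and the learned mean settles near $E[X \mid X > \gamma] \approx 4.2$, i.e. the tilted density concentrates on the rare set.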

Abstract:
Importance sampling has been reported to produce algorithms with excellent empirical performance in counting problems. However, the theoretical support for its efficiency in these applications has been very limited. In this paper, we propose a methodology that can be used to design efficient importance sampling algorithms for counting and test their efficiency rigorously. We apply our techniques after transforming the problem into a rare-event simulation problem--thereby connecting complexity analysis of counting problems with efficiency in the context of rare-event simulation. As an illustration of our approach, we consider the problem of counting the number of binary tables with fixed column and row sums, $c_j$'s and $r_i$'s, respectively, and total marginal sums $d=\sum_jc_j$. Assuming that $\max_jc_j=o(d^{1/2})$, $\sum c_j^2=O(d)$ and the $r_i$'s are bounded, we show that a suitable importance sampling algorithm, proposed by Chen et al. [J. Amer. Statist. Assoc. 100 (2005) 109--120], requires $O(d^3\varepsilon^{-2}\delta^{-1})$ operations to produce an estimate that has $\varepsilon$-relative error with probability $1-\delta$. In addition, if $\max_jc_j=o(d^{1/4-\delta_0})$ for some $\delta_0>0$, the same coverage can be guaranteed with $O(d^3\varepsilon^{-2}\log(\delta^{-1}))$ operations.

Abstract:
Rare events are events that are expected to occur infrequently, or more technically, those that have low probabilities (say, order of $10^{-3}$ or less) of occurring according to a probability model. In the context of uncertainty quantification, the rare events often correspond to failure of systems designed for high reliability, meaning that the system performance fails to meet some design or operation specifications. As reviewed in this section, computation of such rare-event probabilities is challenging. Analytical solutions are usually not available for non-trivial problems and standard Monte Carlo simulation is computationally inefficient. Therefore, much research effort has focused on developing advanced stochastic simulation methods that are more efficient. In this section, we address the problem of estimating rare-event probabilities by Monte Carlo simulation, Importance Sampling and Subset Simulation for highly reliable dynamic systems.
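The contrast between standard Monte Carlo and importance sampling drawn above can be made concrete on a one-line example. The sketch below (our own toy problem, not from the reviewed text) estimates $p = P(X > \gamma)$ for $X \sim \mathrm{Exp}(1)$ by sampling from a heavier-tailed proposal $\mathrm{Exp}(\lambda)$, $\lambda < 1$, under which the event is common, and correcting with the likelihood ratio.

```python
import math
import random

# Importance sampling for p = P(X > gamma), X ~ Exp(1).
# Proposal: Exp(lam) with lam < 1, so the event {x > gamma} is no longer rare.
# Bias is corrected by the likelihood ratio
#   w(x) = f(x) / g(x) = exp(-x) / (lam * exp(-lam * x)).

def is_estimate(gamma=10.0, lam=0.1, n=100000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(lam)  # draw from the proposal Exp(lam)
        if x > gamma:
            total += math.exp(-x) / (lam * math.exp(-lam * x))
    return total / n
```

The exact answer is $e^{-\gamma} \approx 4.5\times 10^{-5}$ for $\gamma = 10$; plain Monte Carlo with the same budget would see only a handful of events, while the tilted estimator attains a relative error of about one percent.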

Abstract:
A sampling procedure for the transition matrix Monte Carlo method is introduced that generates the density of states function over a wide parameter range with minimal coding effort.

Abstract:
Importance sampling is a variance reduction technique for efficient estimation of rare-event probabilities by Monte Carlo. In standard importance sampling schemes, the system is simulated using an a priori fixed change of measure suggested by a large deviation lower bound analysis. Recent work, however, has suggested that such schemes do not work well in many situations. In this paper we consider dynamic importance sampling in the setting of uniformly recurrent Markov chains. By ``dynamic'' we mean that in the course of a single simulation, the change of measure can depend on the outcome of the simulation up to that time. Based on a control-theoretic approach to large deviations, the existence of asymptotically optimal dynamic schemes is demonstrated in great generality. The implementation of the dynamic schemes is carried out with the help of a limiting Bellman equation. Numerical examples are presented to contrast the dynamic and standard schemes.

Abstract:
The efficient calculation of rare-event kinetics in complex dynamical systems, such as the rate and pathways of ligand dissociation from a protein, is a generally unsolved problem. Markov state models can systematically integrate ensembles of short simulations and thus effectively parallelize the computational effort, but the rare events of interest still need to be spontaneously sampled in the data. Enhanced sampling approaches, such as parallel tempering or umbrella sampling, can accelerate the computation of equilibrium expectations massively, but sacrifice the ability to compute dynamical expectations. In this work we establish a principle to combine knowledge of the equilibrium distribution with kinetics from fast "downhill" relaxation trajectories using reversible Markov models. This approach is general as it does not invoke any specific dynamical model, and can provide accurate estimates of the rare event kinetics. Large gains in sampling efficiency can be achieved whenever one direction of the process occurs more rapidly than its reverse, making the approach especially attractive for downhill processes such as folding and binding in biomolecules.

Abstract:
The efficient importance sampling (EIS) method is a general principle for the numerical evaluation of high-dimensional integrals that uses the sequential structure of target integrands to build variance minimising importance samplers. Despite a number of successful applications in high dimensions, it is well known that importance sampling strategies are subject to an exponential growth in variance as the dimension of the integration increases. We solve this problem by recognising that the EIS framework has an offline sequential Monte Carlo interpretation. The particle EIS method is based on non-standard resampling weights that take into account the look-ahead construction of the importance sampler. We apply the method for a range of univariate and bivariate stochastic volatility specifications. We also develop a new application of the EIS approach to state space models with Student's t state innovations. Our results show that the particle EIS method strongly outperforms both the standard EIS method and particle filters for likelihood evaluation in high dimensions. Moreover, the ratio between the variances of the particle EIS and particle filter methods remains stable as the time series dimension increases. We illustrate the efficiency of the method for Bayesian inference using the particle marginal Metropolis-Hastings and importance sampling squared algorithms.
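The particle-filter baseline that the particle EIS method is compared against can be sketched in a few lines. The code below is a generic bootstrap particle filter for log-likelihood evaluation in a toy linear-Gaussian state-space model; the model, parameter values, and function names are illustrative choices of ours, not the EIS construction or the stochastic volatility specifications of the paper.

```python
import math
import random

# Bootstrap particle filter for the toy state-space model
#   x_t = phi * x_{t-1} + N(0, sig_x^2),   y_t = x_t + N(0, sig_y^2).
# Each step: propagate particles through the state equation, weight by the
# observation density, accumulate log of the mean weight (the likelihood
# increment), then resample.

def bootstrap_loglik(ys, phi=0.9, sig_x=0.5, sig_y=1.0, n=2000, seed=0):
    rng = random.Random(seed)
    # initialize from the stationary distribution of the AR(1) state
    parts = [rng.gauss(0.0, sig_x / math.sqrt(1 - phi ** 2)) for _ in range(n)]
    loglik = 0.0
    for y in ys:
        parts = [phi * x + rng.gauss(0.0, sig_x) for x in parts]      # propagate
        ws = [math.exp(-0.5 * ((y - x) / sig_y) ** 2)
              / (sig_y * math.sqrt(2 * math.pi)) for x in parts]      # weight
        loglik += math.log(sum(ws) / n)   # likelihood increment p(y_t | y_{1:t-1})
        parts = rng.choices(parts, weights=ws, k=n)                   # resample
    return loglik
```

As the abstract notes, the variance of such plain filters grows quickly with the time-series dimension, which is the problem the look-ahead resampling weights of particle EIS are designed to address.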

Abstract:
Rare events are ubiquitous in many different fields, yet they are notoriously difficult to simulate because few, if any, events are observed in a conventional simulation run. Over the past several decades, specialised simulation methods have been developed to overcome this problem. We review one recently-developed class of such methods, known as Forward Flux Sampling. Forward Flux Sampling uses a series of interfaces between the initial and final states to calculate rate constants and generate transition paths, for rare events in equilibrium or nonequilibrium systems with stochastic dynamics. This review draws together a number of recent advances, summarizes several applications of the method and highlights challenges that remain to be overcome.
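The interface construction described above can be illustrated on a minimal toy system: a biased 1D random walk where the "rare event" is reaching $B$ ($x = L$) before returning to $A$ ($x = 0$). The sketch below is our own simplification, with interfaces at the integers, and the rare probability obtained as the product of stage-crossing probabilities; a real FFS implementation would store full configurations at each interface rather than a scalar.

```python
import random

# Toy forward flux sampling (FFS): a random walk steps +1 with probability
# p_up, else -1, starting just past the first interface at x = 1.
# Interfaces at x = 1, ..., L split the A -> B transition into stages;
# P(reach B before A) = product over stages of P(reach lam+1 before 0 | at lam).

def ffs_probability(L=8, p_up=0.3, n_per_interface=5000, seed=7):
    rng = random.Random(seed)
    prob = 1.0
    configs = [1]                       # states stored at the current interface
    for lam in range(1, L):
        successes = []
        for _ in range(n_per_interface):
            x = rng.choice(configs)     # fire a trial from a stored state
            while 0 < x <= lam:         # run until next interface or back to A
                x += 1 if rng.random() < p_up else -1
            if x == lam + 1:
                successes.append(x)
        if not successes:
            return 0.0                  # no trial crossed; estimate collapses
        prob *= len(successes) / n_per_interface
        configs = successes             # seed the next stage from the crossers
    return prob
```

For this walk the answer is known in closed form (gambler's ruin): with $r = (1-p_{\mathrm{up}})/p_{\mathrm{up}}$, the probability is $(r-1)/(r^L-1) \approx 1.5\times 10^{-3}$ for the defaults above, so the FFS product can be checked directly.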