I have wondered on several occasions how one might sample large subcritical Bernoulli bond-percolation clusters, say on the square lattice.
More precisely, let's consider the lattice $\mathbb{Z}^2$, and open each edge independently with probability $p < p_c = 0.5$. I am interested in the event that the vertices $(0,0)$ and $(N,0)$ are connected by a path of open edges, where $N$ is some large number ("large" meaning here "many times the correlation length").
It is well known (rigorously) that this probability decays exponentially fast in $N$, and one has a lot of information about the geometry of the corresponding cluster (e.g., under diffusive scaling the cluster converges to a Brownian bridge as $N \to \infty$, it has a maximal "width" of order $\log N$, etc.).
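For concreteness, here is what the naive approach looks like (a brute-force Python sketch; the box `margin` and the cap `max_tries` are arbitrary choices of mine): draw i.i.d. configurations in a box around the segment and reject until the two endpoints happen to be connected. By the exponential decay above, the expected number of tries grows exponentially in $N$, so this only works up to a few correlation lengths.

```python
import random
from collections import deque

def sample_cluster_naive(N, p, margin=20, max_tries=10**6):
    """Rejection sampling: draw i.i.d. configurations in a box around the
    segment from (0,0) to (N,0), rejecting until the endpoints are connected.
    Hopeless once N is many correlation lengths."""
    xs = range(-margin, N + margin + 1)
    ys = range(-margin, margin + 1)
    for _ in range(max_tries):
        # Open each edge of the box independently with probability p.
        open_edges = set()
        for x in xs:
            for y in ys:
                if random.random() < p:
                    open_edges.add(((x, y), (x + 1, y)))   # horizontal edge
                if random.random() < p:
                    open_edges.add(((x, y), (x, y + 1)))   # vertical edge
        # BFS from the origin along open edges.
        cluster, queue = {(0, 0)}, deque([(0, 0)])
        while queue:
            x, y = queue.popleft()
            for edge, nbr in ((((x, y), (x + 1, y)), (x + 1, y)),
                              (((x - 1, y), (x, y)), (x - 1, y)),
                              (((x, y), (x, y + 1)), (x, y + 1)),
                              (((x, y - 1), (x, y)), (x, y - 1))):
                if edge in open_edges and nbr not in cluster:
                    cluster.add(nbr)
                    queue.append(nbr)
        if (N, 0) in cluster:
            return cluster   # the origin's cluster, conditioned to reach (N,0)
    return None              # gave up
```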
Question: Are there (not too inefficient) algorithms that would sample typical clusters contributing to this event?
Edit: googling a bit, I stumbled upon this paper. I have yet to read it (and to decide whether it is relevant to my question). What the author seems to be doing is growing his objects in a biased way, which helps create the very unlikely configurations he's after, but would by itself result in the wrong final distribution. So he is at the same time correcting the bias (in a way I haven't yet understood from just browsing through the paper).
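To make this bias-then-correct idea concrete, here is a bare-bones importance-sampling sketch (my own toy version, not necessarily what the paper does; the tilt $q$ and the box `margin` are knobs I made up for illustration): explore the cluster of the origin edge by edge, open each examined edge with a tilted probability $q$ instead of $p$, and carry the likelihood-ratio weight along.

```python
import math
import random
from collections import deque

def grow_cluster_tilted(N, p, q, margin=20):
    """Explore the cluster of the origin, opening each newly examined edge
    with a tilted probability q (e.g. q > p) instead of p, while recording
    the likelihood-ratio weight
        w = (p/q)^{#opened} * ((1-p)/(1-q))^{#closed}.
    Unexamined edges have the same law under both measures, so
    E_q[w * 1{(N,0) reached}] = P_p[(0,0) <-> (N,0) in the box]."""
    def in_box(v):
        return -margin <= v[0] <= N + margin and -margin <= v[1] <= margin
    cluster, frontier, examined = {(0, 0)}, deque([(0, 0)]), set()
    log_w = 0.0
    while frontier:
        x, y = frontier.popleft()
        for nbr in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            edge = tuple(sorted(((x, y), nbr)))
            if not in_box(nbr) or edge in examined:
                continue
            examined.add(edge)
            if random.random() < q:                 # open under the tilted law
                log_w += math.log(p / q)
                if nbr not in cluster:
                    cluster.add(nbr)
                    frontier.append(nbr)
            else:                                   # closed under the tilted law
                log_w += math.log((1 - p) / (1 - q))
    return cluster, math.exp(log_w), (N, 0) in cluster
```

Averaging $w \cdot 1\{\text{hit}\}$ over many runs gives an unbiased estimator of the connection probability, and the successful clusters, weighted by $w$, have the correct conditional distribution. The obvious catch is that for a fixed tilt $q$ the weights spread over exponentially many orders of magnitude, so the variance is terrible; controlling it (by pruning/enrichment, resampling, etc.) is presumably where the real work in the paper lies.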
As an example of what his method achieves, he's sampling 18 disjoint crossing clusters for critical percolation on $\mathbb{Z}^3$ in a box of size $128 \times 128 \times 2000$, an event of probability of order $10^{-300}$. So:
Alternative question: does anybody know how this works? (I can also simply read the paper, but it might be useful for me and others if a nice self-contained description were given here by someone knowledgeable.)
Apparently, looking more thoroughly, there is a lot of material that could be useful. It seems that the relevant keywords here are: rare-event simulation, importance sampling, splitting, etc.
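For instance, splitting would go roughly as follows (again only a toy sketch under arbitrary choices of mine: level function = rightmost coordinate reached by the cluster, fixed splitting factor `m`, level spacing `step`): whenever a growing cluster first crosses a new level, clone it into $m$ copies carrying $1/m$ of its weight each, and let the clones evolve with independent randomness from then on.

```python
import random
from collections import deque

def splitting_estimate(N, p, n0=200, step=5, m=2, margin=20):
    """Toy fixed-splitting estimator of P_p[(0,0) <-> (N,0)] in a box.
    Levels: the rightmost x-coordinate reached by the growing cluster, in
    increments of `step`.  A cluster crossing a new level is cloned into m
    copies, each with 1/m of the weight; clones use independent randomness
    for the not-yet-examined edges, which keeps the estimator unbiased
    (the exploration is Markov in (cluster, frontier, examined))."""
    def in_box(v):
        return -margin <= v[0] <= N + margin and -margin <= v[1] <= margin
    # A particle is (frontier, cluster, examined edges, next level, weight).
    particles = [(deque([(0, 0)]), {(0, 0)}, set(), step, 1.0 / n0)
                 for _ in range(n0)]
    total = 0.0
    while particles:
        frontier, cluster, examined, level, w = particles.pop()
        while frontier:
            x, y = frontier.popleft()
            for nbr in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                edge = tuple(sorted(((x, y), nbr)))
                if not in_box(nbr) or edge in examined:
                    continue
                examined.add(edge)
                if random.random() < p and nbr not in cluster:
                    cluster.add(nbr)
                    frontier.append(nbr)
            if max(v[0] for v in cluster) >= level:
                # Crossed a new level: split into m clones with weight w/m.
                for _ in range(m - 1):
                    particles.append((deque(frontier), set(cluster),
                                      set(examined), level + step, w / m))
                level, w = level + step, w / m
        if (N, 0) in cluster:
            total += w
    return total
```

For this to be efficient, `step` should be tuned so that each level is reached from the previous one with probability roughly $1/m$; otherwise the particle population explodes or dies out.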