Single-Site Ancestral Metropolis-Hastings
Ancestral Metropolis-Hastings is one of the most fundamental Bayesian inference methods. In ancestral Metropolis-Hastings, values are sampled from the model's priors, and each sample is accepted or rejected according to its Metropolis acceptance probability. As such, ancestral Metropolis-Hastings is a very general inference method, making no strong assumptions about the structure of the model. However, this generality can make it rather inefficient for many models.
Bean Machine provides a single-site variant of ancestral Metropolis-Hastings, in which values of random variables are sampled and updated one variable at a time (hence the name "single-site").
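As a running example for this page, here is a minimal coin-flipping model. The `coin_bias` and `coin_toss` names are illustrative, not part of the library; the decorator and distribution calls are standard Bean Machine and PyTorch usage.

```python
import beanmachine.ppl as bm
import torch.distributions as dist


@bm.random_variable
def coin_bias():
    # Prior belief about the coin's probability of heads.
    return dist.Beta(2.0, 2.0)


@bm.random_variable
def coin_toss(i: int):
    # Each toss is Bernoulli-distributed given the latent bias.
    return dist.Bernoulli(coin_bias())
```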
Algorithm
Imagine we are using Single-Site Ancestral Metropolis-Hastings to choose a new value for a random variable X. Let's assume that X is currently assigned a value x. Below are the steps of this algorithm (a code sketch of one update follows this list):
- First, we need to propose a new value for X. We do this by sampling from X's prior distribution, and using that sample as the new proposed value for X. Let's call that sampled value x′.
- Next, we need to identify the other random variables that X may have a direct influence upon, or that may directly influence X. This set of random variables is referred to as the Markov blanket of X. The Markov blanket of a random variable consists of the random variable's parents (those that it depends upon), its children (those that depend upon it), and the other parents of the random variable's children. We only need to consider the Markov blanket of X when assessing the proposal, because only the likelihoods of these distributions are directly affected by a change in the value of X. All other random variables in the model are conditionally independent of X given the random variables in X's Markov blanket.
- Now, we need to assess whether this sample is appropriate. We will examine the likelihood of x′, conditional on the other variables in its Markov blanket. We can do this computationally by computing the (log) likelihoods of those other random variables when X = x′.
- Finally, we compare the (log) likelihoods obtained when X = x and when X = x′. We use the Metropolis acceptance probability to preferentially accept values that have relatively higher (log) likelihoods. The exact acceptance probability can be read about in the linked article, or in the algorithm details below.
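The following is a minimal, self-contained sketch of one such update for the coin model above, written directly against torch.distributions rather than Bean Machine's internals; the function name and arguments are illustrative, not library API.

```python
import torch
import torch.distributions as dist


def single_site_ancestral_mh_step(x, observed_tosses):
    """One illustrative single-site ancestral MH update for the coin-bias model.

    x: current value of coin_bias (a tensor in (0, 1)).
    observed_tosses: tensor of observed 0/1 toss outcomes; together with the
    Beta prior, these children make up the Markov blanket terms we must score.
    """
    prior = dist.Beta(2.0, 2.0)

    # Propose x′ by sampling from the prior (the "ancestral" proposal).
    x_new = prior.sample()

    # Score the children of coin_bias under the current and proposed values.
    # Because the proposal density equals the prior density, the prior terms
    # cancel in the acceptance ratio and only the children's likelihoods remain.
    log_like_old = dist.Bernoulli(x).log_prob(observed_tosses).sum()
    log_like_new = dist.Bernoulli(x_new).log_prob(observed_tosses).sum()

    # Accept or reject using the Metropolis acceptance probability.
    log_alpha = log_like_new - log_like_old
    if torch.log(torch.rand(())) < log_alpha:
        return x_new  # accept the proposed value
    return x          # reject and keep the current value
```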
Details
In pseudo-code, the standard ancestral Metropolis-Hastings algorithm is:
```
For each inference iteration:
    For each unobserved random variable X:
        Perform a Metropolis-Hastings (MH) update, which involves:
            1. Propose a new value x′ for X using proposal Q
            2. Update the world σ to σ′
            3. Accept / reject the new value x′ using the Metropolis acceptance probability
```
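For reference, the acceptance probability in step 3 takes the standard Metropolis-Hastings form, written here in the notation of the pseudo-code and assuming X's joint density factors into a prior p(x) and the likelihood p(children ∣ x) of its children:

$$
\alpha(x, x') = \min\left(1,\; \frac{p(x')\, p(\text{children} \mid x')\, Q(x \mid x')}{p(x)\, p(\text{children} \mid x)\, Q(x' \mid x)}\right)
$$

Because the ancestral proposal Q draws x′ from the prior, Q(x′ ∣ x) = p(x′), so the prior and proposal terms cancel and the ratio reduces to the likelihood ratio of the children in X's Markov blanket.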
Usage
The following code snippet illustrates how to use the inference method.
```python
import beanmachine.ppl as bm

samples = bm.SingleSiteAncestralMetropolisHastings().infer(
    queries,
    observations,
    num_samples,
    num_chains,
)
```
The parameters to `infer` are described below:
| Name | Usage |
| --- | --- |
| `queries` | A `List` of `@bm.random_variable` targets to fit posterior distributions for. |
| `observations` | The `Dict` of observations. Each key is a random variable, and its value is the observed value for that random variable. |
| `num_samples` | Number of samples to build up distributions for the values listed in `queries`. |
| `num_chains` | Number of separate inference runs to use. Multiple chains can be used by diagnostics to verify inference ran correctly. |
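Putting this together with the coin model sketched earlier (the `coin_bias` and `coin_toss` definitions are the illustrative ones from above, and the specific observed values are made up for the example):

```python
import beanmachine.ppl as bm
import torch

# Observed outcomes for three hypothetical coin tosses.
observations = {
    coin_toss(0): torch.tensor(1.0),
    coin_toss(1): torch.tensor(1.0),
    coin_toss(2): torch.tensor(0.0),
}

samples = bm.SingleSiteAncestralMetropolisHastings().infer(
    queries=[coin_bias()],
    observations=observations,
    num_samples=1000,
    num_chains=4,
)

# Posterior samples for the queried variable, indexed by chain and draw.
coin_bias_samples = samples[coin_bias()]
```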