beanmachine.ppl.inference.proposer.hmc_proposer module

class beanmachine.ppl.inference.proposer.hmc_proposer.HMCProposer(initial_world: beanmachine.ppl.world.world.World, target_rvs: Set[beanmachine.ppl.model.rv_identifier.RVIdentifier], num_adaptive_samples: int, trajectory_length: float, initial_step_size: float = 1.0, adapt_step_size: bool = True, adapt_mass_matrix: bool = True, full_mass_matrix: bool = False, target_accept_prob: float = 0.8, jit_backend: beanmachine.ppl.experimental.torch_jit_backend.TorchJITBackend = TorchJITBackend.NNC)

Bases: beanmachine.ppl.inference.proposer.base_proposer.BaseProposer

The basic Hamiltonian Monte Carlo (HMC) algorithm as described in [1] plus a dual-averaging mechanism for dynamically adjusting the step size [2].

References:

[1] Radford Neal. “MCMC Using Hamiltonian Dynamics” (2011). https://arxiv.org/abs/1206.1901

[2] Matthew Hoffman and Andrew Gelman. “The No-U-Turn Sampler: Adaptively Setting Path Lengths in Hamiltonian Monte Carlo” (2014). https://arxiv.org/abs/1111.4246

When mass-matrix adaptation is disabled (adapt_mass_matrix=False), the mass matrix M is fixed to the identity I; otherwise a diagonal mass matrix (or a dense one when full_mass_matrix=True) is estimated during the adaptation phase.
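To make the roles of the step size and trajectory length concrete, the sketch below shows a single leapfrog step of the Hamiltonian dynamics with M = I; per proposal, the proposer takes roughly trajectory_length / step_size such steps. This is an illustrative sketch rather than the proposer's internal code, and potential_fn here stands in for the negative joint log density of the world:

    import torch

    def leapfrog_step(q, p, potential_fn, step_size):
        # One leapfrog step with mass matrix M = I:
        # half momentum step, full position step, half momentum step.
        q = q.detach().requires_grad_(True)
        grad_U = torch.autograd.grad(potential_fn(q), q)[0]
        p = p - 0.5 * step_size * grad_U                        # half step for momentum
        q = (q + step_size * p).detach().requires_grad_(True)   # full step for position
        grad_U = torch.autograd.grad(potential_fn(q), q)[0]
        p = p - 0.5 * step_size * grad_U                        # half step for momentum
        return q.detach(), p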

Parameters
  • initial_world – Initial world to propose from.

  • target_rvs – Set of RVIdentifiers to indicate which variables to propose.

  • num_adaptive_samples – Number of adaptive samples to run.

  • trajectory_length – Length of a single Hamiltonian trajectory.

  • initial_step_size – Initial step size.

  • adapt_step_size – Flag whether to adapt the step size, defaults to True.

  • adapt_mass_matrix – Flag whether to adapt the mass matrix, defaults to True.

  • full_mass_matrix – Flag whether to adapt a full mass matrix rather than a diagonal one, defaults to False.

  • target_accept_prob – Target acceptance probability, defaults to 0.8.

  • jit_backend – JIT backend used to accelerate the inference, defaults to TorchJITBackend.NNC.
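HMCProposer is normally constructed by Bean Machine's inference machinery rather than by hand. The sketch below shows the usual entry point, the GlobalHamiltonianMonteCarlo inference class, which drives this proposer internally; the toy model and the exact keyword arguments passed to infer are illustrative assumptions, not a prescribed usage:

    import beanmachine.ppl as bm
    import torch
    import torch.distributions as dist

    @bm.random_variable
    def mu():
        return dist.Normal(0.0, 1.0)

    @bm.random_variable
    def x(i):
        return dist.Normal(mu(), 1.0)

    observations = {x(i): torch.tensor(1.0) for i in range(5)}

    samples = bm.GlobalHamiltonianMonteCarlo(trajectory_length=1.0).infer(
        queries=[mu()],
        observations=observations,
        num_samples=500,
        num_chains=2,
        num_adaptive_samples=500,
    )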

do_adaptation(*args, **kwargs) → None
finish_adaptation() → None
propose(world: beanmachine.ppl.world.world.World) → Tuple[beanmachine.ppl.world.world.World, torch.Tensor]
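These methods implement the proposer protocol defined by BaseProposer: propose returns a new world together with the acceptance log probability, do_adaptation updates the step size (and, if enabled, the mass matrix) during warmup, and finish_adaptation freezes the adapted values. A schematic driver loop might look like the following; the real sampler in beanmachine.ppl.inference handles this internally, and the keyword arguments passed to do_adaptation here are assumptions for illustration:

    def run_chain(proposer, world, num_adaptive_samples, num_samples):
        samples = []
        # Warmup: tune the step size via dual averaging (and, optionally, the mass matrix).
        for _ in range(num_adaptive_samples):
            world, accept_log_prob = proposer.propose(world)
            proposer.do_adaptation(world=world, accept_log_prob=accept_log_prob)
        proposer.finish_adaptation()
        # Sampling: draw from the chain with the adapted, now-fixed kernel.
        for _ in range(num_samples):
            world, _ = proposer.propose(world)
            samples.append(world)
        return samples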