# Functional probabilistic programming for scalable Bayesian modelling

@article{Law2019FunctionalPP, title={Functional probabilistic programming for scalable Bayesian modelling}, author={Jonathan Law and Darren J. Wilkinson}, journal={arXiv: Computation}, year={2019} }

Bayesian inference involves the specification of a statistical model by a statistician or practitioner, with careful thought about what each parameter represents. This results in particularly interpretable models which can be used to explain relationships present in the observed data. Bayesian models are useful when an experiment has only a small number of observations, and in applications where transparency of data-driven decisions is important. Traditionally, parameter inference in Bayesian…

#### One Citation

Automatic Backward Filtering Forward Guiding for Markov processes and graphical models

- Mathematics, Computer Science
- 2020

This work backpropagates the information provided by observations through the model, transforming the generative (forward) model into a preconditioned model guided by the data, which approximates the actual conditional model with a known likelihood ratio between the two.

#### References

Showing 1–10 of 54 references.

Stan: A Probabilistic Programming Language

- Computer Science
- 2017

Stan is a probabilistic programming language for specifying statistical models that provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods such as the No-U-Turn sampler and an adaptive form of Hamiltonian Monte Carlo sampling.

Functional programming for modular Bayesian inference

- Computer Science
- Proc. ACM Program. Lang.
- 2018

This work presents an architectural design of a library for Bayesian modelling and inference in modern functional programming languages that enables deterministic testing of inherently stochastic Monte Carlo algorithms, and demonstrates using OCaml that an expressive module system can also implement the design.

Automatic Differentiation Variational Inference

- Computer Science, Mathematics
- J. Mach. Learn. Res.
- 2017

Automatic differentiation variational inference (ADVI) is developed, where the scientist provides only a probabilistic model and a dataset, nothing else, and the algorithm automatically derives an efficient variational inference algorithm, freeing the scientist to refine and explore many models.
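To make the idea behind ADVI concrete, the following is a minimal sketch (not the ADVI implementation from the cited paper) of stochastic gradient ascent on the ELBO for a one-dimensional mean-field Gaussian approximation, using the reparameterisation z = m + exp(s)·ε. The function names and the illustrative target are assumptions for this sketch only.

```python
import math
import random

def advi_fit(grad_log_p, n_iters=5000, lr=0.02, seed=0):
    """Fit q(z) = N(m, exp(s)^2) to an unnormalised target p(z) by
    stochastic gradient ascent on the ELBO, using single-sample
    reparameterised gradients: z = m + exp(s) * eps, eps ~ N(0, 1)."""
    rng = random.Random(seed)
    m, s = 0.0, 0.0
    for _ in range(n_iters):
        eps = rng.gauss(0.0, 1.0)
        z = m + math.exp(s) * eps
        g = grad_log_p(z)
        # dELBO/dm = E[grad log p(z)]; entropy term contributes +1 to ds
        m += lr * g
        s += lr * (g * math.exp(s) * eps + 1.0)
    return m, math.exp(s)

# Illustrative target N(2, 1), so grad log p(z) = -(z - 2);
# the fitted q should recover mean ~2 and standard deviation ~1.
m, sd = advi_fit(lambda z: -(z - 2.0))
```

Because the gradient uses a single Monte Carlo sample per iteration, the estimates fluctuate around the optimum; real implementations add adaptive step sizes and convergence diagnostics.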

Denotational validation of higher-order Bayesian inference

- Computer Science
- Proc. ACM Program. Lang.
- 2018

A modular semantic account of Bayesian inference algorithms for probabilistic programming languages, as used in data science and machine learning, is presented, and Kock's synthetic measure theory is used to emphasize the connection between the semantic manipulation and its traditional measure-theoretic origins.

Practical probabilistic programming with monads

- Computer Science
- 2015

This work uses a GADT as an underlying representation of a probability distribution and applies Sequential Monte Carlo-based methods to achieve efficient inference, and defines a formal semantics via measure theory.
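The monadic view of distributions underlying this line of work can be illustrated with a small sketch. The cited paper uses a GADT and SMC inference; the Python class below is only an illustration of the monadic interface (a distribution as a sampling function composed with `bind`), with all names assumed for this sketch.

```python
import random

class Dist:
    """A tiny probability monad: a distribution is a sampling function
    rng -> value, and `bind` sequences distributions (monadic flatMap)."""
    def __init__(self, sampler):
        self.sampler = sampler

    def sample(self, rng):
        return self.sampler(rng)

    def bind(self, f):
        # f maps a sampled value to the next distribution in the model
        return Dist(lambda rng: f(self.sampler(rng)).sampler(rng))

    @staticmethod
    def pure(x):
        return Dist(lambda rng: x)

def normal(mu, sigma):
    return Dist(lambda rng: rng.gauss(mu, sigma))

# A simple generative model: mu ~ N(0, 1); y ~ N(mu, 0.5)
model = normal(0.0, 1.0).bind(
    lambda mu: normal(mu, 0.5).bind(
        lambda y: Dist.pure((mu, y))))

rng = random.Random(1)
mu, y = model.sample(rng)
```

The point of the monadic encoding is that models compose like ordinary programs, while an inference backend (SMC in the cited work) can interpret the same structure differently from forward sampling.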

Monte Carlo Sampling Methods Using Markov Chains and Their Applications

- Mathematics
- 1970

A generalization of the sampling method introduced by Metropolis et al. (1953) is presented, along with an exposition of the relevant theory, techniques of application, and methods and…
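The Metropolis-Hastings algorithm described in this reference can be sketched in a few lines; the following is a minimal random-walk variant for a one-dimensional target, with the function name and illustrative target assumed for this sketch.

```python
import math
import random

def metropolis_hastings(log_target, init, n_steps, step_size=1.0, seed=0):
    """Random-walk Metropolis-Hastings: propose x' ~ N(x, step_size^2)
    and accept with probability min(1, target(x') / target(x))."""
    rng = random.Random(seed)
    x = init
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step_size)
        log_alpha = log_target(proposal) - log_target(x)
        if log_alpha >= 0 or rng.random() < math.exp(log_alpha):
            x = proposal
        samples.append(x)
    return samples

# Illustrative target: standard normal, via its unnormalised log-density
samples = metropolis_hastings(lambda x: -0.5 * x * x,
                              init=0.0, n_steps=20000)
mean = sum(samples) / len(samples)
```

Only the ratio of target densities is needed, which is why the method works with unnormalised posteriors; the symmetric Gaussian proposal makes the Hastings correction term cancel.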

WinBUGS - A Bayesian modelling framework: Concepts, structure, and extensibility

- Computer Science
- Stat. Comput.
- 2000

This work discusses how and why various modern computing concepts, such as object-orientation and run-time linking, feature in the software's design, and how the framework may be extended.

Commutative Semantics for Probabilistic Programming

- Computer Science
- ESOP
- 2017

It is shown that probabilistic programs are in fact commutative, by characterizing the measures/kernels that arise from programs as "s-finite", i.e. sums of finite measures/kernels.

JAGS: A program for analysis of Bayesian graphical models using Gibbs sampling

- Computer Science
- 2003

JAGS is a program for Bayesian graphical modelling which aims for compatibility with Classic BUGS. The program could eventually be developed as an R package. This article explains the motivations for…
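The Gibbs sampling that JAGS automates can be illustrated by hand on a model where the full conditionals are known in closed form. The sketch below (names and the bivariate-normal example are assumptions, not JAGS code) alternately samples each coordinate from its conditional distribution.

```python
import random

def gibbs_bivariate_normal(rho, n_steps, seed=0):
    """Gibbs sampler for a bivariate normal with standard normal
    marginals and correlation rho: each full conditional is
    N(rho * other, 1 - rho^2)."""
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    sd = (1.0 - rho * rho) ** 0.5
    draws = []
    for _ in range(n_steps):
        x = rng.gauss(rho * y, sd)  # sample x | y
        y = rng.gauss(rho * x, sd)  # sample y | x
        draws.append((x, y))
    return draws

draws = gibbs_bivariate_normal(rho=0.8, n_steps=20000)
```

A system like JAGS derives these conditionals automatically from the graphical model, which is what makes the BUGS-style declarative specification practical for larger models.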

The No-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo

- Computer Science, Mathematics
- J. Mach. Learn. Res.
- 2014

The No-U-Turn Sampler (NUTS) is an extension to HMC that eliminates the need to set a number of leapfrog steps L, and derives a method for adapting the step size parameter ε on the fly based on primal-dual averaging.
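To see what NUTS automates, the following is a minimal sketch of plain HMC with a hand-chosen number of leapfrog steps L and step size ε, the two tuning parameters that NUTS and its primal-dual averaging scheme remove. The function names and the standard-normal target are assumptions for this sketch; this is not the NUTS algorithm itself.

```python
import math
import random

def hmc_step(log_p, grad_log_p, x, eps, n_leapfrog, rng):
    """One HMC step: sample a momentum, simulate Hamiltonian dynamics
    with the leapfrog integrator for a fixed L steps, then apply a
    Metropolis accept/reject on the joint (position, momentum) energy."""
    p = rng.gauss(0.0, 1.0)
    x_new, p_new = x, p
    p_new += 0.5 * eps * grad_log_p(x_new)   # initial half kick
    for _ in range(n_leapfrog - 1):
        x_new += eps * p_new                 # drift
        p_new += eps * grad_log_p(x_new)     # full kick
    x_new += eps * p_new
    p_new += 0.5 * eps * grad_log_p(x_new)   # final half kick
    h_old = -log_p(x) + 0.5 * p * p
    h_new = -log_p(x_new) + 0.5 * p_new * p_new
    d = h_old - h_new
    if d >= 0 or rng.random() < math.exp(d):
        return x_new
    return x

# Illustrative target: standard normal, log p(x) = -x^2/2, grad = -x
rng = random.Random(0)
x, samples = 0.0, []
for _ in range(5000):
    x = hmc_step(lambda z: -0.5 * z * z, lambda z: -z, x, 0.2, 10, rng)
    samples.append(x)
```

If L is too small the chain behaves like a random walk, and if it is too large the trajectory doubles back on itself; NUTS detects the "U-turn" and stops the trajectory automatically, while adapting ε toward a target acceptance rate.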