Practicalities

0.1 Module Aims

Bayesian inference is a set of methods in which the probability of an event is updated as more information becomes available. It is fundamentally different from frequentist methods, which are based on long-run relative frequencies. This module gives an introduction to the Bayesian approach to statistical analysis and the theory that underpins it.

Students will be able to explain the distinctive features of Bayesian methodology, understand and appreciate the role of prior distributions, and compute posterior distributions. The module covers the derivation of posterior distributions, the construction of prior distributions, and inference for missing data. Extensions to models with more than a single parameter are considered, along with how such models can be used to analyse data. Computational methods have greatly advanced the use of Bayesian inference, and this module covers, and allows students to apply, procedures for sampling from and analysing intractable Bayesian problems.
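As a taste of such procedures, here is a minimal sketch of a random-walk Metropolis sampler; the unnormalised target (a Gamma(4, 2) kernel standing in for an intractable posterior) and the tuning constants are illustrative assumptions, not module material.

```python
import numpy as np

rng = np.random.default_rng(1)

# Unnormalised log target on x > 0: a Gamma(4, 2) kernel, x^3 * exp(-2x),
# standing in for a posterior known only up to proportionality.
def log_target(x):
    return -np.inf if x <= 0 else 3 * np.log(x) - 2 * x

# Random-walk Metropolis: propose x' = x + noise and accept with
# probability min(1, target(x') / target(x)), done on the log scale.
x, samples = 1.0, []
for _ in range(10_000):
    proposal = x + rng.normal(scale=0.8)
    if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
        x = proposal
    samples.append(x)

print(np.mean(samples))  # should be close to the Gamma(4, 2) mean, 4 / 2 = 2
```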

By the end of the course, students should be able to:

  1. Explain what Bayesian inference is: the roles of the prior, likelihood, and posterior, and how beliefs are updated in light of data.
  2. Explain how Bayesian reasoning differs from frequentist methods.
  3. Code up basic Bayesian methods from scratch (see the sketch after this list) and implement more complex methods using packages.
  4. Apply Bayesian methods to a range of problems in the social sciences.
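A minimal from-scratch sketch of the prior-to-posterior updating in points 1 and 3 is given below; the data (7 successes in 20 trials), the grid resolution, and the Beta(2, 2) prior are illustrative assumptions.

```python
import numpy as np

# Illustrative data: 7 successes in 20 Bernoulli trials (assumed values).
k, n = 7, 20

# Grid over the success probability p.
p = np.linspace(0, 1, 1001)

# Beta(2, 2) prior density, up to proportionality: p^(2-1) * (1-p)^(2-1).
prior = p * (1 - p)

# Binomial likelihood, up to proportionality: p^k * (1-p)^(n-k).
likelihood = p**k * (1 - p) ** (n - k)

# Posterior is proportional to prior times likelihood; normalise on the grid.
posterior = prior * likelihood
posterior /= posterior.sum()

print((p * posterior).sum())  # posterior mean
```

Because the Beta prior is conjugate to the binomial likelihood, the grid answer can be checked against the exact Beta(9, 15) posterior, whose mean is 9/24 = 0.375.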

0.2 Module Outline

19th November, Edinburgh Futures Institute

Time Session
1000–1010 Welcome
1010–1045 Bayesian vs Frequentist Methods
1045–1115 Getting hands-on with Bayes' theorem
1115–1130 Break
1130–1150 Getting hands-on with Bayes' theorem (continued)
1150–1230 Priors and Prediction
1230–1330 Lunch
1330–1430 Bayesian Hierarchical Models
1430–1445 Break
1445–1530 INLA
1530–1600 Neural Bayes Estimators

0.4 Common Distributions

For many Bayesian inference problems, it is useful to be able to identify probability density functions (for continuous random variables) and probability mass functions (for discrete random variables) up to proportionality. Some common density/mass functions are given below.
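For example (a constructed illustration), if a posterior density on \(x > 0\) satisfies \[ \pi(x \mid y) \propto x^{3} e^{-2x}, \] then matching this kernel against the gamma density below, with \(\alpha - 1 = 3\) and \(\beta = 2\), identifies the posterior as a Gamma(4, 2) distribution without ever computing the normalising constant \(2^4/\Gamma(4) = 8/3\).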

Normal distribution \[ \pi(x \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp\left\{-\frac{1}{2\sigma^2}(x-\mu)^2\right\} \qquad x \in\mathbb{R}, \] where \(\mu \in \mathbb{R}\) and \(\sigma > 0\).

Beta distribution \[ \pi(x\mid \alpha, \beta) = \frac{1}{B(\alpha, \beta)}x^{\alpha-1}(1-x)^{\beta - 1} \qquad x \in [0, 1], \] where \(\alpha, \beta > 0\) and \(B(\alpha, \beta)\) is the beta function.

Gamma distribution \[ \pi(x\mid \alpha, \beta) = \frac{\beta^\alpha}{\Gamma(\alpha)}x^{\alpha - 1}e^{-\beta x} \qquad x > 0, \] where \(\alpha, \beta > 0\) and \(\Gamma(\alpha)\) is the gamma function.

Exponential distribution \[ \pi(x \mid \lambda) = \lambda e^{-\lambda x} \qquad x > 0, \] where \(\lambda > 0\).

Poisson distribution \[ \pi(x = k \mid \lambda) = \frac{\lambda^k e^{-\lambda}}{k!} \qquad k \in \{0, 1, 2, \ldots\}, \] where \(\lambda > 0\).

Binomial distribution \[ \pi(x = k \mid N, p) = \binom{N}{k} p^k (1-p)^{N-k} \qquad k \in \{0, 1, \ldots, N\}, \] where \(N\) is a positive integer and \(p \in [0, 1]\).
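As a quick sanity check (assuming NumPy and SciPy are available; the parameter values and evaluation points are arbitrary choices for illustration), each formula above can be compared against the corresponding scipy.stats implementation:

```python
import numpy as np
from math import factorial
from scipy import stats
from scipy.special import beta as beta_fn, gamma as gamma_fn, comb

x, k = 1.3, 3  # arbitrary evaluation points

checks = {
    # Normal(mu = 0.5, sigma = 2)
    "normal": (np.exp(-(x - 0.5) ** 2 / (2 * 4)) / np.sqrt(2 * np.pi * 4),
               stats.norm.pdf(x, loc=0.5, scale=2)),
    # Beta(alpha = 2, beta = 5), evaluated at x = 0.3
    "beta": (0.3 ** 1 * 0.7 ** 4 / beta_fn(2, 5),
             stats.beta.pdf(0.3, 2, 5)),
    # Gamma(alpha = 4, beta = 2); scipy parameterises by scale = 1 / beta
    "gamma": (2 ** 4 / gamma_fn(4) * x ** 3 * np.exp(-2 * x),
              stats.gamma.pdf(x, a=4, scale=0.5)),
    # Exponential(lambda = 1.5); scale = 1 / lambda
    "exponential": (1.5 * np.exp(-1.5 * x),
                    stats.expon.pdf(x, scale=1 / 1.5)),
    # Poisson(lambda = 2.5) at k = 3
    "poisson": (2.5 ** k * np.exp(-2.5) / factorial(k),
                stats.poisson.pmf(k, 2.5)),
    # Binomial(N = 10, p = 0.4) at k = 3
    "binomial": (comb(10, k) * 0.4 ** k * 0.6 ** (10 - k),
                 stats.binom.pmf(k, 10, 0.4)),
}

for name, (by_hand, library) in checks.items():
    assert np.isclose(by_hand, library), name
print("all formulas agree with scipy.stats")
```

Note that scipy.stats parameterises the gamma and exponential distributions by a scale parameter, the reciprocal of the rate \(\beta\) or \(\lambda\) used above.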