Bayesian Statistics for Social Scientists
Semester 1, 2025
Practicalities

0.1 Module Aims
Bayesian inference is a set of methods in which the probability of an event occurring can be updated as more information becomes available. It is fundamentally different from frequentist methods, which are based on long-run relative frequencies. This module gives an introduction to the Bayesian approach to statistical analysis and the theory that underpins it.
Students will be able to explain the distinctive features of Bayesian methodology, understand and appreciate the role of prior distributions, and compute posterior distributions. The module will cover the derivation of posterior distributions, the construction of prior distributions, and inference for missing data. Extensions to models with more than one parameter are considered, along with how these models can be used to analyse data. Computational methods have greatly advanced the use of Bayesian methods, and this module covers, and allows students to apply, procedures for sampling from and analysing intractable Bayesian problems.
By the end of the course, students should be able to:
- Explain what Bayesian inference is: the role of prior, likelihood, and posterior; how beliefs are updated from data.
- Explain how Bayesian reasoning differs from frequentist methods.
- Code up some basic Bayesian methods from scratch and implement more complex methods using packages.
- Apply Bayesian methods to a range of problems in the social sciences.
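As a minimal sketch of the prior-to-posterior updating described above, consider a conjugate beta-binomial model: a Beta(α, β) prior for a proportion, updated with k successes out of N trials, yields a Beta(α + k, β + N − k) posterior. The survey numbers below are hypothetical, chosen only for illustration.

```python
def beta_binomial_posterior(alpha, beta, k, n):
    """Conjugate update: Beta(alpha, beta) prior + binomial data
    (k successes in n trials) gives a Beta posterior."""
    return alpha + k, beta + n - k

# Hypothetical example: 7 of 10 respondents support a policy,
# starting from a uniform Beta(1, 1) prior on the proportion.
a_post, b_post = beta_binomial_posterior(1, 1, 7, 10)

# Posterior mean of a Beta(a, b) distribution is a / (a + b).
post_mean = a_post / (a_post + b_post)  # (1 + 7) / (2 + 10) = 2/3
```

Note how the posterior mean sits between the prior mean (1/2) and the observed proportion (7/10): the data pull the prior belief towards the sample, which is the updating behaviour the module aims to make precise.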
0.2 Module Outline
19th November, Edinburgh Futures Institute
| Time | Session |
|---|---|
| 1000–1010 | Welcome |
| 1010–1045 | Bayesian vs Frequentist Methods |
| 1045–1115 | Getting hands on with Bayes' theorem |
| 1115–1130 | Break |
| 1130–1150 | Getting hands on with Bayes' theorem (continued) |
| 1150–1230 | Priors and Prediction |
| 1230–1330 | Lunch |
| 1330–1430 | Bayesian Hierarchical Models |
| 1430–1445 | Break |
| 1445–1530 | INLA |
| 1530–1600 | Neural Bayes Estimators |
0.3 Recommended Books and Videos
You may find it useful to use other resources in your studies. I recommend the following:
Statistical Rethinking - Richard McElreath. This book provides a friendly, intuitive understanding of Bayesian inference and computation. Aimed at social and natural scientists, it has less theory than the other two books but is perhaps more approachable. A set of video lectures for this book can be found on YouTube.
A First Course in Bayesian Statistical Methods - Peter D. Hoff. This is a short book that covers the basics of Bayesian inference and computation. To the point and well written, it's a useful place to look up topics.
Bayesian Data Analysis - Andrew Gelman, John Carlin, Hal Stern, David Dunson, Aki Vehtari, and Donald Rubin. This is a thorough book explaining everything you’d need to know to carry out Bayesian data analysis. It’s a fairly long and in-depth book, but the authors are authoritative and give good advice throughout. Example code on the website is in R, Python and Stan.
0.4 Common Distributions
For many Bayesian inference problems, it is useful to be able to identify probability density functions (for continuous random variables) and probability mass functions (for discrete random variables) up to proportionality. Some common density/mass functions are given below.
Normal distribution \[ \pi(x \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp\left\{-\frac{1}{2\sigma^2}(x-\mu)^2\right\} \qquad x \in\mathbb{R}, \] where \(\mu \in \mathbb{R}\) and \(\sigma > 0\).
Beta distribution \[ \pi(x\mid \alpha, \beta) = \frac{1}{B(\alpha, \beta)}x^{\alpha-1}(1-x)^{\beta - 1} \qquad x \in [0, 1], \] where \(\alpha, \beta > 0\) and \(B(\alpha, \beta)\) is the beta function.
Gamma distribution \[ \pi(x\mid \alpha, \beta) = \frac{\beta^\alpha}{\Gamma(\alpha)}x^{\alpha - 1}e^{-\beta x} \qquad x > 0, \] where \(\alpha, \beta > 0\) and \(\Gamma(\alpha)\) is the gamma function.
Exponential distribution \[ \pi(x \mid \lambda) = \lambda e^{-\lambda x} \qquad x > 0, \] where \(\lambda > 0\).
Poisson distribution \[ \pi(x = k \mid \lambda) = \frac{\lambda^k e^{-\lambda}}{k!} \qquad k \in \{0, 1, 2, \ldots\}, \] where \(\lambda > 0\).
Binomial distribution \[ \pi(x = k \mid N, p) = \binom{N}{k} p^k (1-p)^{N-k} \qquad k \in \{0, 1, \ldots, N\}, \] where \(N \in \{1, 2, \ldots\}\) and \(p \in [0, 1]\).
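To see why identifying a density "up to proportionality" is enough, note that the part of the Gamma density that depends on \(x\), the kernel \(x^{\alpha-1}e^{-\beta x}\), integrates to \(\Gamma(\alpha)/\beta^\alpha\), which is exactly the reciprocal of the normalising constant. The sketch below (parameter values are arbitrary, chosen for illustration) checks this numerically with a crude Riemann sum.

```python
import math

def gamma_kernel(x, alpha, beta):
    # Unnormalised Gamma density: only the factors that depend on x.
    return x ** (alpha - 1) * math.exp(-beta * x)

alpha, beta = 3.0, 2.0

# Crude Riemann-sum integral of the kernel over (0, 50); the tail
# beyond 50 is negligible for these parameter values.
step = 0.001
integral = sum(gamma_kernel(i * step, alpha, beta)
               for i in range(1, 50001)) * step

# The claimed normalising constant: Gamma(3) / 2^3 = 2 / 8 = 0.25.
normaliser = math.gamma(alpha) / beta ** alpha
```

The sum agrees with \(\Gamma(\alpha)/\beta^\alpha\) to numerical accuracy, so once a posterior kernel is recognised as \(x^{\alpha-1}e^{-\beta x}\), it can be declared a Gamma\((\alpha, \beta)\) distribution without computing any integral.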