Simon Jackman

Bayesian Analysis for the Social Sciences

IPSA Summer School, São Paulo, February 2012

This class provides a practical introduction to Bayesian statistical inference, with an emphasis on applications in the social sciences. We will begin slowly, with a consideration of how Bayesian statistical inference differs from classical or frequentist inference. We will examine these differences in the context of simple statistical procedures and models: e.g., one- and two-sample t-tests, the analysis of two-by-two tables, one-way ANOVA, and regression.
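To give a flavour of the contrast in a simple setting, the sketch below compares frequentist and Bayesian inference for a single proportion, using the conjugate Beta-Binomial model. The counts (40 successes in 100 trials) and the uniform Beta(1,1) prior are invented for illustration; this is not an example from the text.

```r
## Hypothetical data: 40 successes in 100 trials
y <- 40; n <- 100
a <- 1;  b <- 1                        # Beta(1,1) = uniform prior

## Frequentist: MLE and 95% Wald confidence interval
p.hat   <- y / n
se      <- sqrt(p.hat * (1 - p.hat) / n)
wald.ci <- c(p.hat - 1.96 * se, p.hat + 1.96 * se)

## Bayesian: the posterior is Beta(a + y, b + n - y),
## so a 95% credible interval comes straight from the Beta quantiles
bayes.ci <- qbeta(c(0.025, 0.975), a + y, b + n - y)

wald.ci
bayes.ci
```

With a flat prior and this much data the two intervals nearly coincide; the differences between the approaches lie in how the intervals are interpreted, a point we take up in class.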

We will then show how the explosion in desktop computing power since the 1990s has made Bayesian approaches attractive for more complex models. Specifically, the set of algorithms known as Markov chain Monte Carlo (MCMC) allows researchers to tackle classes of problems that used to fall in the "too hard" basket.
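The idea behind MCMC can be conveyed in a few lines of R. The sketch below implements a random-walk Metropolis sampler, one member of the MCMC family we will study; the target (a standard normal density) and the proposal scale are chosen purely for illustration, not taken from the text.

```r
## Random-walk Metropolis sampling from a standard normal target
set.seed(42)
n.iter   <- 5000
theta    <- numeric(n.iter)
theta[1] <- 0                          # arbitrary starting value

for (t in 2:n.iter) {
  proposal  <- theta[t - 1] + rnorm(1, mean = 0, sd = 1)  # random-walk proposal
  log.ratio <- dnorm(proposal, log = TRUE) -
               dnorm(theta[t - 1], log = TRUE)            # log acceptance ratio
  if (log(runif(1)) < log.ratio) {
    theta[t] <- proposal               # accept the proposal
  } else {
    theta[t] <- theta[t - 1]           # reject: chain stays put
  }
}

mean(theta)   # close to 0, the target mean
sd(theta)     # close to 1, the target standard deviation
```

In real applications the target is a posterior density known only up to a constant of proportionality; the acceptance ratio needs the density only up to that constant, which is what makes the algorithm so useful.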

Today, MCMC is well and truly part of the statistical computing toolkit available to social scientists, and is implemented in various forms in many different software packages (we will survey some of these below). We will examine how these algorithms make Bayesian inference feasible, their strengths and weaknesses, and some of the pitfalls to avoid when deploying MCMC algorithms.

Text

We will use my text, Bayesian Analysis for the Social Sciences (Wiley, 2009).

Please consult the errata (last changed May 23, 2013) and let me know if you spot any further errors, typos, etc.

Class material:

- Syllabus (last changed: 9:17 pm, January 8 2012)
- Introduction, Chapter 1 (last changed: 3:05 pm, February 4 2012)
- Conjugate Bayesian analysis (last changed: 5:21 am, February 11 2012)
- Bayesian Regression Analysis
- The Monte Carlo Principle (last changed: 6:09 pm, February 3 2012)
- Markov chain Monte Carlo (last changed: 1:18 am, February 8 2012)
- Practical Markov chain Monte Carlo (last changed: 6:13 pm, February 3 2012)
- Hierarchical Models (last changed: 10:10 am, February 11 2012)
- Discrete Choice Data (last changed: 6:13 pm, February 3 2012)
- Bayesian Inference for Latent States (last changed: 6:13 pm, February 3 2012)

Worked Examples

- The rjags R package
- Brazilian legislative roll call data, analysis in ideal: R code; Rdata object
- Monte Carlo analysis, difference of two binomial proportions: R; JAGS
- Regression, with posterior predictive density (election fraud in a Pennsylvania state senate election): R; JAGS
- Multivariate t regression for compositional data (Example 6.9; UK House of Commons):
- Blocking example (Example 6.3): R; JAGS (faster/blocked); JAGS (slower/unblocked)
- Missing data example (poking holes in the electoral fraud data set): R; JAGS
- One-way ANOVA, presidential elections data: R; JAGS
- One-way hierarchical model (High School and Beyond data): data set 1; data set 2; data reader R; set for JAGS R; JAGS
- Iraq war, use-of-force authorization in the U.S. Senate: R; JAGS
- Rock the Vote analysis: R; JAGS
- Ordinal logistic model (political information): R; JAGS
- Ordinal logistic model with hierarchical structure ("random effects" for interviewers): JAGS
- Factor analysis, social attributes of the 50 American states: R; JAGS
- Hierarchical version of the factor analysis model, §9.2.5: JAGS
- Analysis of roll calls from the 110th U.S. Senate: PDF
- Hierarchical version of the roll call model (Example 9.2): JAGS
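To illustrate the logic behind the "difference of two binomial proportions" example above: with independent Beta priors on the two proportions, each posterior is Beta, and Monte Carlo simulation from the two posteriors gives a sample from the posterior of the difference. The counts below are invented for illustration and are not the data used in the course example.

```r
## Monte Carlo analysis of the difference of two binomial proportions
set.seed(1234)
y1 <- 30; n1 <- 50             # group 1: hypothetical successes / trials
y2 <- 20; n2 <- 50             # group 2: hypothetical successes / trials
M  <- 10000                    # Monte Carlo sample size

## Independent Beta(1,1) priors yield Beta posteriors for each proportion
p1 <- rbeta(M, 1 + y1, 1 + n1 - y1)
p2 <- rbeta(M, 1 + y2, 1 + n2 - y2)

## The sampled differences are draws from the posterior of p1 - p2
delta <- p1 - p2

quantile(delta, c(0.025, 0.5, 0.975))  # posterior summary of the difference
mean(delta > 0)                        # Pr(p1 > p2 | data)
```

No closed-form posterior exists for the difference, but the Monte Carlo sample makes any summary of it (quantiles, tail probabilities) trivial to compute; this is the Monte Carlo principle covered in the class notes.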