In Bayesian statistics, inference about a population parameter or hypothesis is carried out by combining prior knowledge, expressed as a prior probability distribution, with observed data. Bayes' rule merges the prior and the data into a posterior distribution, and this course focuses both on how the posterior is computed and on how it is used in practice. Topics include specific families of prior and posterior distributions, conjugate prior/posterior pairs, decision theory, Bayesian prediction, Bayesian parameter estimation and its associated uncertainty, and the Monte Carlo methods commonly used in Bayesian inference. Students will apply Bayesian methods to analyze and interpret several real-world data sets and will investigate some of the theoretical issues underlying Bayesian statistical analysis. R will be used to illustrate the concepts discussed in class. (Note: prior experience with R is not required; students unfamiliar with R will be directed to an online tutorial.)
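The prior-to-posterior update described above can be illustrated with the simplest conjugate pair covered in such a course, the beta-binomial model: a Beta prior on a success probability, combined with binomial data, yields a Beta posterior in closed form. A minimal sketch (shown here in Python for brevity, though the course itself uses R; the function name and example numbers are illustrative, not from the course):

```python
# Beta-binomial conjugate update: a Beta(a, b) prior on a success
# probability theta, combined with k successes in n Bernoulli trials,
# yields a Beta(a + k, b + n - k) posterior.

def beta_binomial_posterior(a, b, k, n):
    """Return the (alpha, beta) parameters of the posterior Beta distribution."""
    return a + k, b + (n - k)

# Illustrative example: Beta(2, 2) prior, 7 successes in 10 trials.
a_post, b_post = beta_binomial_posterior(2, 2, 7, 10)
print((a_post, b_post))              # posterior is Beta(9, 5)
print(a_post / (a_post + b_post))    # posterior mean = 9/14
```

The closed form is exactly why conjugate pairs matter in practice: the posterior stays in the same distributional family as the prior, so updating reduces to simple parameter arithmetic rather than numerical integration.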

Course prerequisites: 

Multivariate calculus, familiarity with basic matrix algebra, and a graduate course in probability and statistics (such as 625.403).

Course instructor: 
Botts