This course introduces the fundamentals of the mathematical and logical framework of graphical models. These models are used in many areas of machine learning and arise in numerous challenging and intriguing problems in data analysis, mathematics, and computer science; for example, they are widely applied to large-scale ("big data") problems. While the framework introduced in this course is largely mathematical, we will also present algorithms and connections to problem domains. The course begins with the fundamentals of probability theory and then moves into Bayesian networks, undirected graphical models, template-based models, and Gaussian networks. Inference and learning on graphical structures will be covered, with explorations of complexity, conditioning, clique trees, and optimization. Weekly problem sets and a term project will encourage mastery of the fundamentals of this emerging area.

Course prerequisites: 

A graduate course in probability and statistics (such as 625.403 Statistical Methods and Data Analysis).

Course notes: 

This course is the same as 625.492 Probabilistic Graphical Models.

Course instructor: 
Woolf
