Probabilistic Graphical Models

Fall 2019 (COMP767-002)

Administrative

Class: Tuesdays and Thursdays 1:05-2:25 PM, ENGMC 320
Office hours: Thursdays 2:30-3:30 PM, ENGMC 325
TA: Harsh Satija
TA office hours: Fridays 4-5 PM, ENGMC 106

Course Description

Graphical models are powerful probabilistic modeling tools. They can model the complex behavior of a large system of interacting variables through local relations specified using a graph. These probabilistic models represent the conditional dependencies between subsets of variables in a compressed and elegant form. The framework of graphical models has achieved remarkable success across a variety of domains, from near-optimal codes for communication to the state of the art in combinatorial optimization; these models are widely used in bioinformatics, robotics, vision, natural language processing and machine learning. In this course, we study both directed and undirected models, as well as exact and approximate inference procedures, including deterministic optimization-based and stochastic sampling-based methods. We also discuss various approaches to learning models from data and, time permitting, discuss deep generative models as instances of graphical models.
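
By way of illustration (not part of the course materials), the short NumPy sketch below encodes a made-up two-variable Bayesian network and answers a query by enumeration. The variable names and probabilities are invented for the example; the point is only that the joint distribution is stored as small local factors attached to the graph.

    import numpy as np

    # Hypothetical two-node network: Rain -> WetGrass.
    # The joint factorizes over the graph: P(R, W) = P(R) * P(W | R),
    # so we store two small local tables instead of one joint table.
    p_rain = np.array([0.8, 0.2])             # P(R=0), P(R=1)
    p_wet_given_rain = np.array([[0.9, 0.1],  # P(W | R=0)
                                 [0.2, 0.8]]) # P(W | R=1)

    # Recover the joint from the local factors.
    joint = p_rain[:, None] * p_wet_given_rain  # joint[r, w] = P(R=r, W=w)

    # Inference by enumeration: P(R | W=1) = P(R, W=1) / P(W=1).
    posterior = joint[:, 1] / joint[:, 1].sum()
    print(posterior)  # -> [1/3, 2/3]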


Prerequisites:

Basic familiarity with probability theory and algorithm design is required. Assignments require familiarity with Python and NumPy. A background in AI, and in particular in (applied) machine learning, is highly recommended.

Main textbook:

Koller, Daphne, and Nir Friedman. Probabilistic graphical models: principles and techniques. MIT Press, 2009.

Further readings:


Course Material:

Assignments, announcements, slides, project descriptions and other course materials are posted on myCourses.

Outline

Representation
  • Syllabus, review of probability theory (chapter 2)
  • Bayesian Networks (chapter 3)
  • Markov Networks and converting between directed and undirected models (chapter 4)
  • Local and Conditional Probability Models (chapter 5)
  • Gaussian Network Models (chapter 6)
Inference
  • Complexity of Inference and Variable Elimination (chapter 7; see the sketch after this outline)
  • Junction Trees and Belief Propagation (chapter 10)
  • Variational Inference (chapters 8, 11, 13)
    • Exponential Family and Variational Inference
    • Loopy Belief Propagation and Bethe Free Energy
    • Naive Mean-Field
  • Maximum a Posteriori Inference (chapter 13)
  • Sampling Based Inference (chapter 12)
    • Monte Carlo Inference in Graphical Models
    • Markov Chain Monte Carlo
Learning
  • Overview: Objectives in Learning (chapter 16)
  • Maximum Likelihood and Bayesian Estimation in Directed Models (chapter 17)
  • Structure Learning in Directed Models (chapter 18)
  • Parameter Learning in Undirected Models (chapter 20)
  • Learning with Partial Observations (slides, chapter 19)
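
As a taste of the inference material above, the following minimal sketch (again with invented numbers, not course material) runs variable elimination on a hypothetical binary chain A -> B -> C. Summing out one variable at a time keeps every intermediate table small, rather than materializing the full 2x2x2 joint.

    import numpy as np

    # Hypothetical chain A -> B -> C with binary variables.
    p_a = np.array([0.6, 0.4])            # P(A)
    p_b_given_a = np.array([[0.7, 0.3],   # P(B | A=0)
                            [0.1, 0.9]])  # P(B | A=1)
    p_c_given_b = np.array([[0.5, 0.5],   # P(C | B=0)
                            [0.2, 0.8]])  # P(C | B=1)

    # Eliminate A, then B; each step is a small matrix-vector product,
    # so the full joint table is never built.
    msg_b = p_a @ p_b_given_a   # msg_b[b] = sum_a P(a) P(b|a)
    p_c = msg_b @ p_c_given_b   # p_c[c]   = sum_b msg_b[b] P(c|b)
    print(p_c)                  # marginal P(C); entries sum to 1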

Academic Integrity

“McGill University values academic integrity. Therefore, all students must understand the meaning and consequences of cheating, plagiarism and other academic offences under the Code of Student Conduct and Disciplinary Procedures” (see www.mcgill.ca/students/srr/honest/ for more information). (Approved by Senate on 29 January 2003)