Lecture 1 - Recap of Probability Theory

Lecture 2 - Why are we interested in Joint Distributions

Lecture 3 - How do we represent a joint distribution

Lecture 4 - Can we represent the joint distribution more compactly

Lecture 5 - Can we use a graph to represent a joint distribution

Lecture 6 - Different types of reasoning encoded in a Bayesian Network

Lecture 7 - Independencies encoded by a Bayesian Network (Case 1: Node and its parents)

Lecture 8 - Independencies encoded by a Bayesian Network (Case 2: Node and its non-parents)

Lecture 9 - Independencies encoded by a Bayesian Network (Case 3: Node and its descendants)

Lecture 10 - Bayesian Networks: Formal Semantics

Lecture 11 - I-Maps

Lecture 12 - Markov Networks: Motivation

Lecture 13 - Factors in a Markov Network

Lecture 14 - Local Independencies in a Markov Network

Lecture 15 - Joint Distributions

Lecture 16 - The concept of a latent variable

Lecture 17 - Restricted Boltzmann Machines

Lecture 18 - RBMs as Stochastic Neural Networks

Lecture 19 - Unsupervised Learning with RBMs

Lecture 20 - Computing the gradient of the log likelihood

Lecture 21 - Motivation for Sampling

Lecture 22 - Motivation for Sampling - Part 2

Lecture 23 - Markov Chains

Lecture 24 - Why do we care about Markov Chains?

Lecture 25 - Setting up a Markov Chain for RBMs

Lecture 26 - Training RBMs Using Gibbs Sampling

Lecture 27 - Training RBMs Using Contrastive Divergence

Lecture 28 - Revisiting Autoencoders

Lecture 29 - Variational Autoencoders: The Neural Network Perspective

Lecture 30 - Variational Autoencoders: The Graphical Model Perspective

Lecture 31 - Neural Autoregressive Density Estimator

Lecture 32 - Masked Autoencoder for Distribution Estimation (MADE)

Lecture 33 - Generative Adversarial Networks - The Intuition

Lecture 34 - Generative Adversarial Networks - Architecture

Lecture 35 - Generative Adversarial Networks - The Math Behind it

Lecture 36 - Generative Adversarial Networks - Some Cool Stuff and Applications

Lecture 37 - Bringing it all together (the deep generative summary)