Lecture 3 - Spectrum of Transmitted Digital Communication Signal, Autocorrelation Function and Power Spectral Density

Lecture 4 - Spectrum of Transmitted Digital Communication Signal, Relation to Energy Spectral Density and Introduction to AWGN Channel

Lecture 6 - Structure of Digital Communication Receiver, Receiver Filter and Signal-to-Noise Power Ratio (SNR)

Lecture 9 - Probability of Error in Digital Communication and Probability Density Functions of Output

Lecture 10 - Probability of Error in Digital Communication, Optimal Decision Rule and Gaussian Q function

Lecture 11 - Introduction to Binary Phase Shift Keying (BPSK) Modulation, Optimal Decision Rule and Probability of Bit-Error or Bit-Error Rate (BER)
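As a quick companion to this lecture's topic, a minimal sketch (not from the lecture itself) of the standard coherent-BPSK bit-error-rate expression over AWGN, P_b = Q(sqrt(2*Eb/N0)), with the Q function written via the complementary error function:

```python
from math import erfc, sqrt

def Q(x):
    # Gaussian Q function: Q(x) = 0.5 * erfc(x / sqrt(2))
    return 0.5 * erfc(x / sqrt(2))

def bpsk_ber(ebn0_db):
    # BER of coherent BPSK over AWGN: P_b = Q(sqrt(2 * Eb/N0))
    ebn0 = 10 ** (ebn0_db / 10)          # convert dB to linear scale
    return Q(sqrt(2 * ebn0))

print(bpsk_ber(10))  # BER at Eb/N0 = 10 dB, roughly 4e-6
```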

Lecture 13 - Optimal Decision Rule for Amplitude Shift Keying (ASK), Bit Error Rate (BER) and Comparison with Binary Phase Shift Keying (BPSK) Modulation

Lecture 19 - Matched Filtering, Bit Error Rate and Symbol Error Rate for Quadrature Phase Shift Keying (QPSK)

Lecture 20 - Introduction to M-ary PAM (Pulse Amplitude Modulation), Average Symbol Power and Decision Rules

Lecture 21 - M-ary PAM (Pulse Amplitude Modulation) - Part-II, Optimal Decision Rule and Probability of Error

Lecture 22 - M-ary QAM (Quadrature Amplitude Modulation) Part-I, Introduction, Transmitted Waveform and Average Symbol Energy

Lecture 23 - M-ary QAM (Quadrature Amplitude Modulation) - Part-II, Optimal Decision Rule, Probability of Error and Constellation Diagram

Lecture 24 - M-ary PSK (Phase Shift Keying) Part-I, Introduction, Transmitted Waveform and Constellation Diagram

Lecture 25 - M-ary PSK (Phase Shift Keying) - Part-II, Optimal Decision Rule, Nearest Neighbor Criterion and Approximate Probability of Error

Lecture 26 - Introduction to Information Theory, Relevance of Information Theory and Characterization of Information

Lecture 27 - Definition of Entropy, Average Information / Uncertainty of Source and Properties of Entropy
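As an illustration of the definition this lecture introduces (a sketch, not taken from the lecture), the entropy of a discrete source is H(X) = -sum p_i log2 p_i, in bits:

```python
from math import log2

def entropy(probs):
    # H(X) = -sum p_i * log2(p_i); zero-probability symbols
    # contribute nothing (lim p->0 of p*log p is 0)
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))            # fair coin -> 1 bit
print(entropy([0.25, 0.25, 0.25, 0.25]))  # uniform 4-ary source -> 2 bits
```

A deterministic source (one symbol with probability 1) has zero entropy, and entropy is maximized by the uniform distribution, consistent with the properties listed here and in Lecture 29.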

Lecture 29 - Maximum Entropy of Source with M-ary Alphabet, Concave/Convex Functions and Jensen's Inequality

Lecture 30 - Joint Entropy, Definition of Joint Entropy of Two Sources and Simple Examples for Joint Entropy Computation

Lecture 32 - Conditional Entropy, Example of Conditional Entropy and Properties of Conditional Entropy

Lecture 34 - Simple Example of Mutual Information and Practical Example of Mutual Information-Binary Symmetric Channel

Lecture 35 - Channel Capacity, Implications of Channel Capacity, Claude E. Shannon - Father of Information Theory and Example of Capacity of Binary Symmetric Channel
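The closing example of this lecture, the binary symmetric channel with crossover probability p, has the well-known capacity C = 1 - H(p) bits per channel use; a minimal sketch:

```python
from math import log2

def h2(p):
    # binary entropy function H(p)
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    # Capacity of the binary symmetric channel: C = 1 - H(p)
    return 1 - h2(p)

print(bsc_capacity(0.0))   # noiseless channel -> 1 bit/use
print(bsc_capacity(0.5))   # completely noisy channel -> 0 bits/use
```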

Lecture 40 - Capacity of Gaussian Channel - Part-II, Practical Implications and Maximum Rate in bits/sec
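The maximum rate this lecture discusses is given by Shannon's formula for the band-limited Gaussian channel, C = B log2(1 + SNR) bits/sec; a small sketch (the 1 MHz / 20 dB figures below are illustrative, not from the lecture):

```python
from math import log2

def awgn_capacity(bandwidth_hz, snr_db):
    # Shannon capacity of the band-limited AWGN channel:
    # C = B * log2(1 + SNR), in bits per second
    snr = 10 ** (snr_db / 10)   # convert dB to linear SNR
    return bandwidth_hz * log2(1 + snr)

print(awgn_capacity(1e6, 20))   # 1 MHz channel at 20 dB SNR, about 6.66 Mbit/s
```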

Lecture 41 - Introduction to Source Coding and Data Compression, Variable Length codes and Unique Decodability

Lecture 51 - Matrix Representation of Convolutional Codes, Generator Matrix, Transform Domain Representation and Shift Register Architecture

Lecture 52 - State Diagram Representation of Convolutional Code, State transitions and Example of Code Generation using State transitions