NOC: Principles of Communication Systems - Part II


Lecture 1 - Introduction to Digital Communication Systems


Lecture 2 - Spectrum of Transmitted Digital Communication Signal and Wide Sense Stationarity


Lecture 3 - Spectrum of Transmitted Digital Communication Signal, Autocorrelation Function and Power Spectral Density


Lecture 4 - Spectrum of Transmitted Digital Communication Signal, Relation to Energy Spectral Density and Introduction to AWGN Channel


Lecture 5 - Additive White Gaussian Noise (AWGN) Properties, Gaussian Noise and White Noise


Lecture 6 - Structure of Digital Communication Receiver, Receiver Filter and Signal-to-Noise Power Ratio (SNR)


Lecture 7 - Digital Communication Receiver, Noise Properties and Output Noise Power


Lecture 8 - Digital Communication Receiver, Optimal SNR and Matched Filter


Lecture 9 - Probability of Error in Digital Communication and Probability Density Functions of Output


Lecture 10 - Probability of Error in Digital Communication, Optimal Decision Rule and Gaussian Q function


Lecture 11 - Introduction to Binary Phase Shift Keying (BPSK) Modulation, Optimal Decision Rule and Probability of Bit-Error or Bit-Error Rate (BER)
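As a companion to Lectures 10-11, here is a minimal Python sketch (not from the course material) of the Gaussian Q-function and the standard BPSK bit-error-rate formula Pe = Q(sqrt(2 Eb/N0)); the function names are illustrative.

```python
import math

def q_function(x):
    """Gaussian Q-function: tail probability of the standard normal,
    Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def bpsk_ber(eb_n0_db):
    """BER of BPSK over AWGN: Pe = Q(sqrt(2 * Eb/N0))."""
    eb_n0 = 10 ** (eb_n0_db / 10)  # convert dB to linear scale
    return q_function(math.sqrt(2 * eb_n0))
```

For example, `q_function(0)` is 0.5 (half the Gaussian mass lies above the mean), and the BER falls steeply as Eb/N0 grows.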


Lecture 12 - Introduction to Amplitude Shift Keying (ASK) Modulation


Lecture 13 - Optimal Decision Rule for Amplitude Shift Keying (ASK), Bit Error Rate (BER) and Comparison with Binary Phase Shift Keying (BPSK) Modulation


Lecture 14 - Introduction to Signal Space Concept and Orthonormal Basis Signals


Lecture 15 - Introduction to Frequency Shift Keying (FSK)


Lecture 16 - Optimal Decision Rule for FSK, Bit Error Rate (BER) and Comparison with BPSK, ASK


Lecture 17 - Introduction to Quadrature Phase Shift Keying (QPSK)


Lecture 18 - Waveforms of Quadrature Phase Shift Keying (QPSK)


Lecture 19 - Matched Filtering, Bit Error Rate and Symbol Error Rate for Quadrature Phase Shift Keying (QPSK)


Lecture 20 - Introduction to M-ary PAM (Pulse Amplitude Modulation), Average Symbol Power and Decision Rules


Lecture 21 - M-ary PAM (Pulse Amplitude Modulation) - Part-II, Optimal Decision Rule and Probability of Error


Lecture 22 - M-ary QAM (Quadrature Amplitude Modulation) Part-I, Introduction, Transmitted Waveform and Average Symbol Energy


Lecture 23 - M-ary QAM (Quadrature Amplitude Modulation) - Part-II, Optimal Decision Rule, Probability of Error and Constellation Diagram


Lecture 24 - M-ary PSK (Phase Shift Keying) - Part-I, Introduction, Transmitted Waveform and Constellation Diagram


Lecture 25 - M-ary PSK (Phase Shift Keying) - Part-II, Optimal Decision Rule, Nearest Neighbor Criterion and Approximate Probability of Error


Lecture 26 - Introduction to Information Theory, Relevance of Information Theory and Characterization of Information


Lecture 27 - Definition of Entropy, Average Information/Uncertainty of Source and Properties of Entropy


Lecture 28 - Entropy Example - Binary Source, Maximum and Minimum Entropy of Binary Source
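The binary entropy function discussed in this lecture can be sketched in a few lines of Python (an illustration, not course code): H(p) is 0 bits for a deterministic source and peaks at 1 bit for p = 0.5.

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits.
    By convention 0*log2(0) = 0, so H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)
```

For instance, `binary_entropy(0.5)` returns 1.0, the maximum entropy of a binary source, while any biased source has strictly less.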


Lecture 29 - Maximum Entropy of Source with M-ary Alphabet, Concave/Convex Functions and Jensen's Inequality


Lecture 30 - Joint Entropy, Definition of Joint Entropy of Two Sources and Simple Examples of Joint Entropy Computation


Lecture 31 - Properties of Joint Entropy and Relation between Joint Entropy and Marginal Entropies


Lecture 32 - Conditional Entropy, Example of Conditional Entropy and Properties of Conditional Entropy


Lecture 33 - Mutual Information, Diagrammatic Representation and Properties of Mutual Information


Lecture 34 - Simple Example of Mutual Information and Practical Example of Mutual Information-Binary Symmetric Channel


Lecture 35 - Channel Capacity, Implications of Channel Capacity, Claude E. Shannon - Father of Information Theory and Example of Capacity of Binary Symmetric Channel


Lecture 36 - Differential Entropy and Example for Uniform Probability Density Function


Lecture 37 - Differential Entropy of Gaussian Source and Insights


Lecture 38 - Joint/Conditional Differential Entropies and Mutual Information


Lecture 39 - Capacity of Gaussian channel - Part I


Lecture 40 - Capacity of Gaussian Channel - Part-II, Practical Implications and Maximum Rate in bits/sec
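The maximum rate referred to here is the Shannon capacity of a band-limited AWGN channel, C = B log2(1 + SNR) bits/sec. A minimal sketch (illustrative function and parameter names, not from the lectures):

```python
import math

def awgn_capacity(bandwidth_hz, snr_linear):
    """Shannon capacity of a band-limited AWGN channel:
    C = B * log2(1 + SNR), in bits/sec."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 3 kHz channel at 30 dB SNR
snr = 10 ** (30 / 10)            # 30 dB -> linear SNR of 1000
c = awgn_capacity(3000.0, snr)   # roughly 3e4 bits/sec
```

Note that capacity grows only logarithmically with SNR but linearly with bandwidth, one of the practical implications this lecture highlights.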


Lecture 41 - Introduction to Source Coding and Data Compression, Variable Length codes and Unique Decodability


Lecture 42 - Uniquely Decodable Codes, Prefix-free Code, Instantaneous Code and Average Code Length


Lecture 43 - Binary Tree Representation of Code, Example and Kraft Inequality
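The Kraft inequality from this lecture states that a D-ary prefix-free code with codeword lengths l_i exists if and only if sum(D^-l_i) <= 1. A one-function check (an illustrative sketch, not course code):

```python
def satisfies_kraft(lengths, D=2):
    """Check the Kraft inequality sum(D**-l) <= 1 for codeword
    lengths of a D-ary code; True means a prefix-free code with
    these lengths exists."""
    return sum(D ** -l for l in lengths) <= 1

satisfies_kraft([1, 2, 3, 3])  # 1/2 + 1/4 + 1/8 + 1/8 = 1, so True
satisfies_kraft([1, 1, 2])     # 1/2 + 1/2 + 1/4 > 1, so False
```

Equality (sum exactly 1) corresponds to a complete code, where every leaf of the binary code tree is used.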


Lecture 44 - Lower Bound on Average Code Length and Kullback-Leibler Divergence


Lecture 45 - Optimal Code Length, Constrained Optimization and Morse Code Example


Lecture 46 - Approaching Lower Bound on Average Code Length and Block Coding


Lecture 47 - Huffman Code, Algorithm, Example and Average Code Length
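The Huffman algorithm from this lecture repeatedly merges the two least probable symbols. A compact heap-based sketch (illustrative implementation, not the lecture's own code):

```python
import heapq

def huffman_code(probs):
    """Build a binary Huffman code from {symbol: probability}.
    Returns {symbol: codeword string}."""
    # Heap entries: (probability, tie-breaker, {symbol: partial codeword});
    # the integer tie-breaker keeps tuple comparison away from the dicts.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # two least probable groups
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, tie, merged))
        tie += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
avg_len = sum(p * len(code[s]) for s, p in probs.items())
# For this dyadic source the average length equals the entropy, 1.75 bits
```

Because the probabilities here are negative powers of two, Huffman coding meets the entropy lower bound exactly; in general it comes within one bit of it.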


Lecture 48 - Introduction to Channel Coding, Rate of Code, Repetition Code and Hamming Distance


Lecture 49 - Introduction to Convolutional Codes, Binary Field Arithmetic and Linear Codes


Lecture 50 - Example of Convolutional Code Output and Convolution Operation for Code Generation


Lecture 51 - Matrix Representation of Convolutional Codes, Generator Matrix, Transform Domain Representation and Shift Register Architecture


Lecture 52 - State Diagram Representation of Convolutional Code, State Transitions and Example of Code Generation using State Transitions


Lecture 53 - Trellis Representation of Convolutional Code and Valid Code Words


Lecture 54 - Decoding of the Convolutional Code, Minimum Hamming Distance and Maximum Likelihood Codeword Estimate


Lecture 55 - Principle of Decoding of Convolutional Code


Lecture 56 - Viterbi Decoder for Maximum Likelihood Decoding of Convolutional Code Using Trellis Representation, Branch Metric Calculation, State Metric Calculation and Example
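Lectures 49-56 build up to Viterbi decoding over the code trellis. The sketch below uses a standard textbook rate-1/2, constraint-length-3 code (generators 7 and 5 in octal); this specific code and the function names are assumptions for illustration, not necessarily what the lectures use.

```python
def conv_encode(bits, state=0):
    """Rate-1/2 convolutional encoder, generators g0=111, g1=101.
    The 2-bit state holds the last two input bits."""
    out = []
    for b in bits:
        s = (b << 2) | state                        # input + 2 memory bits
        out.append(((s >> 2) ^ (s >> 1) ^ s) & 1)   # g0 = 111
        out.append(((s >> 2) ^ s) & 1)              # g1 = 101
        state = (s >> 1) & 0b11                     # shift-register update
    return out

def viterbi_decode(received, n_bits):
    """Hard-decision Viterbi decoding over the 4-state trellis:
    branch metric = Hamming distance, state metric = running minimum."""
    INF = float("inf")
    metrics = [0] + [INF] * 3        # start in the all-zero state
    paths = [[] for _ in range(4)]   # surviving input sequences
    for i in range(n_bits):
        r = received[2 * i : 2 * i + 2]
        new_metrics = [INF] * 4
        new_paths = [None] * 4
        for state in range(4):
            if metrics[state] == INF:
                continue
            for b in (0, 1):         # hypothesize input bit b
                s = (b << 2) | state
                o0 = ((s >> 2) ^ (s >> 1) ^ s) & 1
                o1 = ((s >> 2) ^ s) & 1
                branch = (o0 != r[0]) + (o1 != r[1])  # branch metric
                nxt = (s >> 1) & 0b11
                m = metrics[state] + branch
                if m < new_metrics[nxt]:              # keep survivor
                    new_metrics[nxt] = m
                    new_paths[nxt] = paths[state] + [b]
        metrics, paths = new_metrics, new_paths
    best = min(range(4), key=lambda st: metrics[st])
    return paths[best]
```

With a clean received sequence the decoder returns the transmitted bits exactly, and because this code has free distance 5 it also corrects isolated single-bit channel errors.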