Analog vs Digital Signal
bit (signal processing)
Source-Channel Separation Theorem
information: an information source can be modeled as a random variable
surprise: information value and entropy
joint entropy; joint entropy of independent events

Source Coding:
non-singular code
uniquely decodable code
Prefix-Free code, which is uniquely decodable and instantaneously decodable
any prefix-free code for a source has average codeword length at least the entropy
for any uniquely decodable code there is a prefix-free code with the same codeword lengths (Kraft-McMillan)
Average Codeword Length
generating a Prefix-Free code with Huffman Coding
Huffman Coding is bounded: average codeword length within 1 bit of the entropy
Block Coding; boundedness of Block Coding and why it saves space!
Dyadic Source
Shannon's Source-Coding Theorem
Entropy Rate of the Source: a non-IID sequence can have smaller entropy, which is why we use the entropy rate of the source as the measure of how good a code can be

Signals:
signal, sinusoid, and Fourier Series
Fourier Series as exactly a shifted sum of sinusoids
General Fourier Decomposition: L-periodic functions and the triangle wave
Fourier Series components form a basis
Bandwidth and Finite-Bandwidth Signals
Discrete Fourier Transform
Lossless Sampling and the Nyquist Sampling Theorem
Baseband Signal and Passband Signal
Interpolation: Zero-Hold Interpolation; best: Infinite-Degree Polynomial Interpolation (uses the sinc function)
Shannon's Nyquist Theorem
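The outline's notions of surprise, entropy, and joint entropy of independent events can be sketched numerically. A minimal sketch; the function names `surprise` and `entropy` are illustrative, not from the notes:

```python
import math

def surprise(p):
    """Information value (surprise) of an outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(dist):
    """Shannon entropy: the expected surprise, H(X) = -sum p(x) log2 p(x)."""
    return sum(p * surprise(p) for p in dist if p > 0)

# A fair coin carries 1 bit of entropy.
fair = [0.5, 0.5]

# For independent X and Y the joint distribution is the product distribution,
# and joint entropy is additive: H(X, Y) = H(X) + H(Y).
joint = [px * py for px in fair for py in fair]

print(entropy(fair))   # 1.0
print(entropy(joint))  # 2.0
```

The additivity check is the "joint entropy of independent events" item: two independent fair coins yield exactly twice the entropy of one.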
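Generating a prefix-free code with Huffman coding, and the dyadic-source best case (all probabilities powers of 1/2, so average codeword length meets the entropy exactly), can be sketched as follows. This is one minimal implementation under assumed illustrative names (`huffman_code`), not the notes' own code:

```python
import heapq
import math

def huffman_code(probs):
    """Build a prefix-free code for {symbol: probability} via Huffman's algorithm."""
    # Heap entries are (probability, tiebreaker, {symbol: codeword-so-far});
    # the integer tiebreaker keeps tuple comparison away from the dicts.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Merge the two least likely subtrees, prepending a distinguishing bit.
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

# A dyadic source: every probability is a power of 1/2.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)

avg_len = sum(p * len(code[s]) for s, p in probs.items())
H = -sum(p * math.log2(p) for p in probs.values())
print(code)
print(avg_len, H)  # for a dyadic source these coincide (both 1.75 here)
```

In general Huffman coding only guarantees H <= average length < H + 1; the dyadic case is where the bound is tight.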
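The sinc-based "infinite-degree polynomial" interpolation can be illustrated with a direct Whittaker-Shannon reconstruction sum. A sketch under assumed parameters (a 1 Hz sinusoid sampled at 8 Hz, well above its 2 Hz Nyquist rate); the true formula sums over all samples, so this finite truncation is only approximate off the sample grid:

```python
import math

def sinc(x):
    """Normalized sinc: sin(pi x) / (pi x), with sinc(0) = 1."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def sinc_interpolate(samples, T, t):
    """Whittaker-Shannon reconstruction of x(t) from samples x(nT)."""
    return sum(x_n * sinc((t - n * T) / T) for n, x_n in enumerate(samples))

# A 1 Hz sinusoid sampled at 8 Hz, comfortably above the 2 Hz Nyquist rate.
f, fs = 1.0, 8.0
T = 1.0 / fs
samples = [math.sin(2 * math.pi * f * n * T) for n in range(64)]

# At a sample instant the sinc terms collapse to a Kronecker delta,
# so the reconstruction returns that sample exactly.
assert abs(sinc_interpolate(samples, T, 5 * T) - samples[5]) < 1e-9

# Off the grid, the reconstruction tracks the original signal mid-record.
t = 2.3
approx = sinc_interpolate(samples, T, t)
exact = math.sin(2 * math.pi * f * t)
print(round(approx, 3), round(exact, 3))
```

Zero-hold interpolation would instead return the most recent sample, a staircase; the sinc sum is what the sampling theorem says recovers a finite-bandwidth signal losslessly when sampled above the Nyquist rate.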