Information Theory

Module Contents

1. Entropy & KL Divergence

Master the foundations of Information Theory: Shannon Entropy and Kullback-Leibler Divergence. Learn how to quantify uncertainty and measure how one distribution diverges from another (KL divergence is not a true distance: it is asymmetric and violates the triangle inequality).
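As a taste of what this module covers, here is a minimal sketch of both quantities in plain NumPy; the example distributions p and q are hypothetical, chosen only to illustrate the calls:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # by convention, 0 * log(0) = 0
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log2(p_i / q_i), in bits.

    Assumes q_i > 0 wherever p_i > 0 (otherwise the divergence is infinite).
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

p = np.array([0.5, 0.25, 0.25])  # hypothetical example distribution
q = np.array([1/3, 1/3, 1/3])    # uniform reference

print(entropy(p))           # 1.5 bits
print(kl_divergence(p, q))  # ~0.085 bits, >= 0 by Gibbs' inequality
```

Note the asymmetry in practice: swapping the arguments of `kl_divergence` generally gives a different value, which is why it measures divergence rather than distance.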

2. Mutual Information

Explore the relationship between random variables. Understand Joint Entropy, Conditional Entropy, and how Mutual Information quantifies dependency.
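A minimal sketch of how these quantities fit together, using the identity I(X; Y) = H(X) + H(Y) − H(X, Y); the joint probability tables below are hypothetical:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, treating 0 * log(0) as 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(pxy):
    """I(X; Y) = H(X) + H(Y) - H(X, Y) for a joint distribution table."""
    px = pxy.sum(axis=1)  # marginal of X (rows)
    py = pxy.sum(axis=0)  # marginal of Y (columns)
    return entropy(px) + entropy(py) - entropy(pxy)

# X and Y perfectly correlated -> I(X; Y) = H(X) = 1 bit
pxy = np.array([[0.5, 0.0],
                [0.0, 0.5]])
print(mutual_information(pxy))  # 1.0

# X and Y independent -> I(X; Y) = 0
pxy_indep = np.outer([0.5, 0.5], [0.5, 0.5])
print(mutual_information(pxy_indep))  # 0.0
```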

3. Cross-Entropy Loss

Dive deep into the standard loss function for classification in Machine Learning. Understand the math behind Softmax, the Log-Sum-Exp trick for numerical stability, and why the loss heavily penalizes confident errors.
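A minimal sketch of the stability trick: subtracting the maximum logit before exponentiating leaves softmax unchanged (the shared factor cancels in numerator and denominator) but prevents overflow. The logits and labels below are hypothetical:

```python
import numpy as np

def cross_entropy(logits, label):
    """Cross-entropy loss -log(softmax(logits)[label]), computed stably.

    Shifting by max(logits) does not change the softmax probabilities,
    but it keeps exp() from overflowing for large logits.
    """
    z = logits - np.max(logits)                  # log-sum-exp shift
    log_softmax = z - np.log(np.sum(np.exp(z)))  # log of softmax, elementwise
    return -log_softmax[label]

logits = np.array([1000.0, 0.0, -1000.0])  # naive exp(1000) would overflow
print(cross_entropy(logits, 0))  # ~0.0: confident and correct
print(cross_entropy(logits, 1))  # ~1000: confident and wrong -> huge penalty
```

The two printed values show the asymmetry the module discusses: a confident correct prediction costs almost nothing, while a confident wrong one is punished severely.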

Review & Cheat Sheet

Quickly review key concepts and formulas, and test your knowledge with interactive flashcards.
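For quick reference, the core formulas from the modules above (standard definitions, collected here in LaTeX):

```latex
\begin{align}
H(X) &= -\sum_x p(x) \log p(x) \\
D_{\mathrm{KL}}(p \,\|\, q) &= \sum_x p(x) \log \frac{p(x)}{q(x)} \\
I(X; Y) &= H(X) + H(Y) - H(X, Y) \\
\mathrm{CE}(p, q) &= -\sum_x p(x) \log q(x) = H(p) + D_{\mathrm{KL}}(p \,\|\, q)
\end{align}
```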