Discrete Math & Information Theory

[!NOTE] This module explores the foundational elements of discrete mathematics and information theory, translating concepts like entropy, graphs, and transforms into concrete tools for modern AI models.

Welcome to Discrete Math & Information Theory! This module is the starting point for understanding how information is quantified, how data can be structured as graphs, and how these ideas power advanced models such as Transformers and variational autoencoders (VAEs).

1. Chapters

  1. Information and Entropy
    • Learn the basic building blocks of information theory.
    • Understand Shannon entropy, cross-entropy, and KL divergence.
  2. Graph Theory Basics
    • Explore graphs as nodes, edges, and adjacency matrices.
    • Understand the foundation of Graph Neural Networks (GNNs).
  3. Fourier Transforms
    • Move from time domain to frequency domain.
    • Understand the Discrete Fourier Transform (DFT) and its role in signal processing and AI.
  4. Complex Numbers and Quaternions
    • Understand the math behind complex numbers and their arithmetic.
    • Explore quaternions for 3D rotations, vital in robotics and 3D computer vision.
  5. Capstone: Transformers & VAEs
    • Apply your knowledge to state-of-the-art generative models.
    • Understand how entropy, graphs, and transforms are used in Transformers and VAEs.
  6. Module Review: Flashcards & Cheat Sheet
    • Review key concepts and formulas.
    • Test your knowledge with interactive flashcards.
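To give a taste of Chapter 1, the three core quantities can be sketched in a few lines of Python. This is a minimal illustration using base-2 logarithms (so results are in bits); the helper names `entropy`, `cross_entropy`, and `kl_divergence` are ours, not from any particular library:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross-entropy H(p, q) = -sum_i p_i * log2(q_i)."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """KL divergence D_KL(p || q) = H(p, q) - H(p); always >= 0."""
    return cross_entropy(p, q) - entropy(p)

p = [0.5, 0.5]  # a fair coin
q = [0.9, 0.1]  # a biased model of that coin
print(entropy(p))           # 1.0 bit of uncertainty
print(kl_divergence(p, q))  # > 0: the model q pays a penalty for mismatching p
```

Note the identity used above: cross-entropy decomposes into the true distribution's entropy plus the KL penalty, which is exactly why minimizing cross-entropy loss in ML is equivalent to minimizing KL divergence to the data distribution.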
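For Chapter 2, here is a small undirected graph represented as an adjacency matrix in plain Python. The 4-node graph is a made-up example; GNN libraries use the same representation, typically as sparse tensors:

```python
# Undirected graph on 4 nodes with edges (0,1), (1,2), (2,3), (0,2)
edges = [(0, 1), (1, 2), (2, 3), (0, 2)]
n = 4

# Build the adjacency matrix A: A[i][j] = 1 iff nodes i and j share an edge
A = [[0] * n for _ in range(n)]
for i, j in edges:
    A[i][j] = A[j][i] = 1  # symmetric, since the graph is undirected

degree = [sum(row) for row in A]  # a node's degree is its row sum
neighbors_of_2 = [j for j in range(n) if A[2][j]]

print(degree)          # [2, 2, 3, 1]
print(neighbors_of_2)  # [0, 1, 3]
```

A GNN layer essentially uses rows of A this way: each node aggregates features from the neighbors its row points to, then applies a learned transformation.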
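Chapter 3's DFT can be written directly from its definition. This naive O(N²) sketch (production code uses the FFT, e.g. `numpy.fft.fft`) shows a pure cosine concentrating its energy in exactly two frequency bins:

```python
import cmath
import math

def dft(x):
    """Naive DFT: X[k] = sum_n x[n] * exp(-2*pi*i*k*n / N)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

# A pure cosine completing 1 cycle over N = 8 samples
N = 8
x = [math.cos(2 * math.pi * n / N) for n in range(N)]
X = dft(x)

mags = [abs(Xk) for Xk in X]
print([round(m, 6) for m in mags])
# Energy sits in bins k=1 and k=7 (the conjugate frequency), magnitude N/2 = 4 each
```

This is the time-to-frequency move in miniature: a signal that oscillates everywhere in time becomes a single spike (plus its mirror) in frequency.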
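Finally, as a preview of Chapter 4, here is a quaternion rotation of a 3D vector using the Hamilton product and the sandwich formula v' = q v q*. The helper names are illustrative, and quaternions are written as (w, x, y, z) tuples:

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotate(v, axis, angle):
    """Rotate 3D vector v about a unit axis by angle radians: v' = q v q*."""
    half = angle / 2
    s = math.sin(half)
    q = (math.cos(half), axis[0]*s, axis[1]*s, axis[2]*s)  # unit quaternion
    q_conj = (q[0], -q[1], -q[2], -q[3])
    _, x, y, z = quat_mul(quat_mul(q, (0.0, *v)), q_conj)
    return (x, y, z)

# Rotating the x-axis 90 degrees about the z-axis lands on the y-axis
print(rotate((1, 0, 0), (0, 0, 1), math.pi / 2))  # approx (0, 1, 0)
```

Unlike Euler angles, this representation has no gimbal lock and interpolates smoothly, which is why robotics and 3D vision pipelines favor it.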

2. Learning Goals

By the end of this module, you will be able to:

  • Compute entropy, cross-entropy, and KL divergence to quantify information.
  • Represent data as graphs and understand adjacency matrices.
  • Decompose signals into frequency components using the Discrete Fourier Transform.
  • Use complex numbers and quaternions to represent rotations in 2D and 3D.
  • Apply information theory to understand generative models like Transformers and VAEs.