LLM Basics

Understand the fundamental building blocks of Generative AI. This module covers how LLMs work under the hood, from tokenization to the Transformer architecture.

Module Contents

1. What are LLMs?

Understand LLMs as probabilistic engines that repeatedly predict the most likely next token. Includes a Next Token Prediction simulator.
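The core loop can be sketched in a few lines: the model assigns a score (logit) to every token in its vocabulary, a softmax turns those scores into probabilities, and the next token is picked from that distribution. The tiny vocabulary and logits below are made up for illustration; a real LLM has tens of thousands of tokens and learned scores.

```python
import math
import random

def softmax(logits):
    """Convert raw scores into a probability distribution that sums to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical vocabulary and logits for a prompt like "The cat sat on the"
vocab = ["mat", "dog", "moon", "chair"]
logits = [4.0, 1.0, 0.5, 2.5]

probs = softmax(logits)

# Greedy decoding: always pick the most probable next token.
next_token = vocab[probs.index(max(probs))]
print(next_token)  # "mat" -- it has the highest logit

# Sampling with temperature: higher temperature flattens the
# distribution, making less likely tokens more probable.
def sample(logits, temperature=1.0):
    p = softmax([x / temperature for x in logits])
    return random.choices(vocab, weights=p, k=1)[0]
```

Greedy decoding is deterministic; sampling (with a temperature knob) is what makes the same prompt produce different completions from run to run.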

2. Tokenization

How text becomes numbers. Explore byte-pair encoding (BPE) versus character-level tokenization with an Interactive Tokenizer Playground.
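The contrast between the two approaches can be sketched directly. Character-level tokenization is trivial; BPE starts from characters and repeatedly merges the most frequent adjacent pair into a new token. This toy version skips details real tokenizers handle (byte fallback, pre-splitting on whitespace, tie-breaking rules), so treat it as an illustration of the merge idea, not a production tokenizer.

```python
from collections import Counter

def char_tokenize(text):
    """Character-level: one token per character (tiny vocab, long sequences)."""
    return list(text)

def most_frequent_pair(tokens):
    """Find the most common adjacent pair, as in one BPE training step."""
    return Counter(zip(tokens, tokens[1:])).most_common(1)[0][0]

def merge_pair(tokens, pair):
    """Replace every occurrence of `pair` with a single merged token."""
    out, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
            out.append(tokens[i] + tokens[i + 1])
            i += 2
        else:
            out.append(tokens[i])
            i += 1
    return out

tokens = char_tokenize("low lower lowest")
for _ in range(3):  # three BPE merge steps
    tokens = merge_pair(tokens, most_frequent_pair(tokens))
print(tokens)  # the frequent substring "low" has become a single token
```

After a few merges, frequent substrings collapse into single tokens, which is why BPE yields far shorter sequences than character-level tokenization while keeping the vocabulary bounded.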

3. The Transformer Architecture

Dive into the “Attention Is All You Need” paper. Visualize Self-Attention weights interactively.
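The attention weights the visualization shows come from scaled dot-product attention, the paper's core operation: each query is compared against every key, the scores are scaled by sqrt(d) and softmaxed into weights, and those weights mix the value vectors. A minimal sketch, using three made-up 2-dimensional token vectors in place of the learned Q/K/V projections a real model would apply:

```python
import math

def softmax(xs):
    """Turn a list of scores into weights that sum to 1."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = len(Q[0])
    out, weights = [], []
    for q in Q:
        # Score this query against every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        w = softmax(scores)
        weights.append(w)
        # Output is the attention-weighted average of the value vectors.
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out, weights

# Three toy token vectors (d = 2); real models derive Q, K, V
# from the input via learned projection matrices.
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
out, weights = self_attention(X, X, X)
for row in weights:
    print([round(w, 2) for w in row])  # each row sums to 1
```

Each row of `weights` is exactly what an attention-weight heatmap visualizes: how much each token attends to every other token when building its new representation.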

Review: Flashcards & Cheat Sheet

Test your knowledge with interactive flashcards and a quick reference guide.