# Prompt Engineering Basics
> [!NOTE]
> Prompt engineering is the practice of designing inputs for LLMs to produce optimal outputs. It is less about “trickery” and more about clear communication and constraint setting.
At its core, an LLM is a probabilistic engine that predicts the next token. A prompt is the initial set of tokens you provide to guide this prediction.
## 1. The Anatomy of a Prompt
A high-quality prompt is structured, not random. It typically contains four key components:
- Role (Persona): Who the AI should be (e.g., “You are a Senior Python Engineer”).
- Context: Background information relevant to the task.
- Instruction: The specific task you want the AI to perform.
- Constraints & Format: How the output should look (e.g., “JSON only”, “under 50 words”).
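Assembled in order, the four components might look like this (a minimal sketch; all of the strings are illustrative placeholders, not a prescribed template):

```python
# The four components of a structured prompt. Every string here is a
# made-up example; substitute your own role, context, task, and constraints.
role = "You are a Senior Python Engineer."
context = "We maintain a Flask API that returns user profiles as JSON."
instruction = "Review the following endpoint for bugs and suggest fixes."
constraints = "Respond in JSON only, under 50 words."

prompt = f"{role}\n\nContext: {context}\n\nTask: {instruction}\n\nConstraints: {constraints}"
print(prompt)
```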
### Roles in API Calls
Modern LLM APIs (like OpenAI’s Chat Completions) explicitly separate these roles:
- System: Sets the behavior and persona. These instructions persist across the conversation and are typically hidden from the end user.
- User: The actual input or question from the human.
- Assistant: The model’s response. You can also pre-fill this to provide examples (Few-Shot).
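In code, these roles become a list of message dicts (a sketch using the OpenAI-style chat message format; the content strings are placeholders):

```python
# The three message roles in an OpenAI-style chat request.
messages = [
    {"role": "system", "content": "You are a Senior Python Engineer. Answer concisely."},
    {"role": "user", "content": "How do I reverse a list in Python?"},
    # An "assistant" message records (or pre-fills) a prior model turn:
    {"role": "assistant", "content": "Use `my_list[::-1]` or `my_list.reverse()`."},
]
```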
## 2. Key Parameters
Controlling how the model generates text is as important as the prompt text itself.
### Temperature (0.0 - 2.0)
Controls randomness.
- Low (0.0 - 0.3): Near-deterministic. The model almost always picks the most likely next token. Best for code, math, and factual answers.
- High (0.8 - 1.5): Creative. The model may pick the 2nd or 3rd most likely token, producing more diverse outputs. Best for poetry and brainstorming.
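Conceptually, temperature divides the model’s raw scores (logits) before they are converted to probabilities: a low temperature sharpens the distribution toward the top token, while a high one flattens it. A minimal sketch with made-up logits for three candidate tokens:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert raw scores into probabilities, scaled by temperature."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [4.0, 2.0, 1.0]  # made-up scores for three candidate tokens
cold = softmax_with_temperature(logits, 0.2)  # top token dominates
hot = softmax_with_temperature(logits, 1.5)   # probability spreads out
print(cold[0], hot[0])  # the top token's share shrinks as temperature rises
```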
### Max Tokens
The hard limit on output length. Use it to prevent the model from rambling or to cap costs. Note that the model is not aware of this limit; output that exceeds it is simply cut off, so it does not make responses more concise.
## 3. Zero-Shot vs. Few-Shot Learning
LLMs are “in-context learners”. They can learn a task just by seeing examples in the prompt, without any parameter updates.
### Zero-Shot
You simply ask the model to perform the task, without providing any examples.
“Translate ‘Hello’ to Spanish.”
### Few-Shot
You provide examples of the task within the prompt. This drastically improves performance on complex or specific formatting tasks.
```
Translate English to Spanish:
Dog → Perro
Cat → Gato
Hello →
```
> [!TIP]
> Even providing one example (one-shot) is significantly better than zero-shot for adhering to specific JSON formats or coding styles.
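In a chat API, few-shot examples can be supplied as alternating user/assistant turns before the real input (a sketch using the OpenAI-style message format; the sentiment-classification task and labels are illustrative):

```python
# Few-shot prompting: prior user/assistant pairs teach the task and
# output format in-context. The examples below are made up.
few_shot_messages = [
    {"role": "system", "content": "Classify the sentiment of each review. Reply in JSON only."},
    # Example 1
    {"role": "user", "content": "The food was amazing!"},
    {"role": "assistant", "content": '{"sentiment": "positive"}'},
    # Example 2
    {"role": "user", "content": "Terrible service, never again."},
    {"role": "assistant", "content": '{"sentiment": "negative"}'},
    # The actual input to classify:
    {"role": "user", "content": "Decent, but overpriced."},
]
```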
## 4. Code Implementation
Here is how you can structure prompts in code using a provider’s client library.
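As one concrete sketch, here is a request using OpenAI’s `openai` Python package (Chat Completions API). The model name and prompt text are placeholders, and running it requires `pip install openai` and an `OPENAI_API_KEY` in your environment:

```python
# Sketch of a Chat Completions request with the openai Python SDK (v1+).
# MODEL and the message contents are placeholders, not recommendations.
MODEL = "gpt-4o-mini"

messages = [
    {"role": "system", "content": "You are a Senior Python Engineer."},
    {"role": "user", "content": "Explain list comprehensions in one sentence."},
]

def ask() -> str:
    """Send the chat request with explicit generation parameters."""
    from openai import OpenAI  # imported here so the sketch is readable without the package
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model=MODEL,
        messages=messages,
        temperature=0.2,   # low temperature: factual / code task
        max_tokens=100,    # hard cap on output length
    )
    return response.choices[0].message.content

# print(ask())  # uncomment to run with a valid API key
```

The generation parameters from section 2 (`temperature`, `max_tokens`) are passed alongside the role-separated messages from section 1, so the whole prompt structure lives in one request.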