Core Concepts
This section covers the fundamental concepts behind Prompty. Each page explains
a key aspect of how Prompty processes .prompty files into LLM results.
The Basics
- The .prompty File Format — anatomy of a .prompty file: frontmatter, body, role markers, template syntax
- Pipeline Architecture — the four-stage pipeline: render → parse → execute → process
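To make the anatomy concrete, here is a minimal sketch of a .prompty file showing frontmatter, body, role markers, and template syntax. The specific field names (`name`, `description`, `model`, `inputs`) are illustrative; see The .prompty File Format page for the authoritative schema.

```
---
name: basic-example
description: Answer a question concisely.
model:
  api: chat
inputs:
  question:
    type: string
---
system:
You are a concise assistant.

user:
{{question}}
```

The frontmatter (between the `---` markers) declares metadata, model settings, and inputs; the body below it is a templated chat transcript split by role markers.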
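The four-stage pipeline can be sketched as plain functions. This is a conceptual illustration, not Prompty's real implementation: each stage body here is a simplified stand-in that shows what the stage is responsible for.

```python
# Conceptual sketch of the render -> parse -> execute -> process pipeline.
# The stage names come from the docs; the bodies are illustrative stubs.

def render(template: str, inputs: dict) -> str:
    # Stage 1: substitute input values into the template body.
    for key, value in inputs.items():
        template = template.replace("{{" + key + "}}", str(value))
    return template

def parse(rendered: str) -> list[dict]:
    # Stage 2: split the rendered body on role markers into chat messages.
    messages, role, lines = [], None, []
    for line in rendered.splitlines():
        stripped = line.strip()
        if stripped.endswith(":") and stripped[:-1].lower() in ("system", "user", "assistant"):
            if role is not None:
                messages.append({"role": role, "content": "\n".join(lines).strip()})
            role, lines = stripped[:-1].lower(), []
        else:
            lines.append(line)
    if role is not None:
        messages.append({"role": role, "content": "\n".join(lines).strip()})
    return messages

def execute(messages: list[dict]) -> dict:
    # Stage 3: invoke the model (stubbed here with an echo response).
    return {"content": "echo: " + messages[-1]["content"]}

def process(raw: dict) -> str:
    # Stage 4: extract the final result from the raw model response.
    return raw["content"]

body = "system:\nYou are concise.\nuser:\n{{question}}"
result = process(execute(parse(render(body, {"question": "What is Prompty?"}))))
print(result)  # -> echo: What is Prompty?
```

In the real library the stages are pluggable, so each one can be swapped or inspected independently; see the Pipeline Architecture page.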
Configuration
- Connections — how Prompty authenticates with LLM providers (API keys, AAD, registries)
- Tools — function, MCP, OpenAPI, and custom tool definitions
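As a rough sketch, a connection and a function tool are both declared in frontmatter. The key names below are illustrative assumptions; the Connections and Tools pages define the exact schema and which `${env:...}`-style references are supported.

```
model:
  api: chat
  configuration:
    type: azure_openai
    azure_endpoint: ${env:AZURE_OPENAI_ENDPOINT}
    azure_deployment: gpt-4o
tools:
  - type: function
    name: get_weather
    description: Look up current weather for a city.
```

Keeping credentials behind environment references (rather than inline) lets the same .prompty file move between environments without edits.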
Advanced Features
- Tracing & Observability — pluggable tracer backends, @trace decorator, OpenTelemetry
- Streaming — real-time chunk delivery, PromptyStream, async iteration
- Structured Output — typed JSON responses via output schemas
- Agent Mode — automatic tool-calling loop with error recovery
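The agent-mode idea (an automatic tool-calling loop with error recovery) can be sketched in a few lines. Everything here is a stand-in: `fake_model`, `get_weather`, and the message shapes are hypothetical, not Prompty's actual API.

```python
# Illustrative agent loop: call the model, run any requested tools,
# feed results back, and repeat until the model gives a final answer.
import json

def get_weather(city: str) -> str:
    # A hypothetical tool the model may request.
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def fake_model(messages: list[dict]) -> dict:
    # Stub model: requests the weather tool once, then answers.
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_calls": [{"name": "get_weather", "arguments": {"city": "Seattle"}}]}
    return {"content": "It is sunny in Seattle."}

def agent_loop(messages: list[dict], max_turns: int = 5) -> str:
    for _ in range(max_turns):
        reply = fake_model(messages)
        calls = reply.get("tool_calls")
        if not calls:
            return reply["content"]  # no tool calls -> final answer
        for call in calls:
            try:
                result = TOOLS[call["name"]](**call["arguments"])
            except Exception as exc:
                # Error recovery: report the failure back to the model
                # instead of aborting the loop.
                result = f"tool error: {exc}"
            messages.append({"role": "tool",
                             "content": json.dumps({"name": call["name"],
                                                    "result": result})})
    raise RuntimeError("agent exceeded max turns")

print(agent_loop([{"role": "user", "content": "Weather in Seattle?"}]))
# -> It is sunny in Seattle.
```

The `max_turns` cap is the usual guard against a model that keeps requesting tools indefinitely.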