# Welcome to Prompty

Prompty is an asset class and format for LLM prompts designed to enhance observability, understandability, and portability for developers. A .prompty file combines structured YAML frontmatter with a markdown prompt body — making prompts versionable, testable, and executable across languages.

A minimal `.prompty` file looks like this:

```
---
name: greeting
model:
  id: gpt-4o-mini
  provider: openai
  connection:
    kind: key
    apiKey: ${env:OPENAI_API_KEY}
inputs:
  - name: userName
    kind: string
    default: World
template:
  format:
    kind: jinja2
  parser:
    kind: prompty
---
system:
You are a friendly assistant.

user:
Say hello to {{userName}}.
```
Running it from a runtime takes a single call:

```python
import prompty

result = prompty.invoke("greeting.prompty", inputs={"userName": "Jane"})
print(result)  # "Hello Jane! How can I help you today?"
```
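To make the template stage concrete, here is a toy sketch of what rendering does with declared inputs and their defaults. The `render` helper is invented for illustration; the real runtime uses a full Jinja2 renderer rather than a regex substitution:

```python
import re

def render(template: str, inputs: dict, defaults: dict) -> str:
    # Merge declared defaults with caller-supplied inputs,
    # then substitute each {{name}} placeholder.
    values = {**defaults, **inputs}
    return re.sub(r"\{\{\s*(\w+)\s*\}\}",
                  lambda m: str(values[m.group(1)]), template)

body = "Say hello to {{userName}}."
print(render(body, {}, {"userName": "World"}))                    # default applies
print(render(body, {"userName": "Jane"}, {"userName": "World"}))  # caller overrides
```

Callers only supply the inputs they want to override; anything omitted falls back to the `default` declared in the frontmatter.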
- **One file, complete prompt** — model config, inputs, tools, and instructions in a single `.prompty` file
- **Multi-language runtimes** — Python and TypeScript today, C# coming soon
- **Pipeline architecture** — render → parse → execute → process, each stage swappable
- **Agent mode** — built-in tool-calling loop with error recovery
- **Structured output** — define output schemas, get typed JSON back
- **Streaming** — chunk-by-chunk delivery with full tracing support
- **Pluggable tracing** — console, JSON file, or OpenTelemetry backends
- **VS Code extension** — syntax highlighting, autocomplete, run & debug, trace viewer
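The four-stage pipeline can be sketched as a chain of ordinary callables, which is what makes each stage swappable. This is an illustrative stand-in with invented names and a stubbed `execute` stage, not the runtime's actual interface:

```python
from typing import Callable

# Each stage is a plain callable, so any one of them can be replaced.
Stage = Callable[[object], object]

def render(source: dict) -> str:
    # Fill the template with inputs (toy substitution).
    text = source["template"]
    for name, value in source["inputs"].items():
        text = text.replace("{{" + name + "}}", str(value))
    return text

def parse(rendered: str) -> list[dict]:
    # Split "role:\ncontent" markup into chat messages.
    messages = []
    for block in rendered.split("\n\n"):
        role, _, content = block.partition(":\n")
        messages.append({"role": role, "content": content.strip()})
    return messages

def execute(messages: list[dict]) -> dict:
    # Stub model call; a real executor would hit the configured provider.
    return {"choices": [{"message": {"content": "Hello Jane!"}}]}

def process(response: dict) -> str:
    # Extract the text the caller actually wants.
    return response["choices"][0]["message"]["content"]

def run(source: dict, stages: list[Stage]) -> object:
    result = source
    for stage in stages:
        result = stage(result)
    return result

prompt = {
    "template": "system:\nYou are a friendly assistant.\n\nuser:\nSay hello to {{userName}}.",
    "inputs": {"userName": "Jane"},
}
print(run(prompt, [render, parse, execute, process]))
```

Swapping a stage means passing a different callable in the list — for example, a streaming executor in place of `execute`, with the other three stages unchanged.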
- 🚀 **Getting Started**: install, write your first prompt, run it
- 📖 **Core Concepts**: file format, pipeline, connections, tools, tracing
- 📋 **Schema Reference**: all frontmatter properties (auto-generated)
- 🔧 **How-To Guides**: practical recipes for common tasks
- 🐍 **Python**: Python runtime guide
- 📘 **TypeScript**: TypeScript runtime guide

Prompty is open source. We welcome contributions to the runtimes, tooling, and documentation. See the Contributing guide to get started.