
# Welcome to Prompty

Prompty is a file format (.prompty) for LLM prompts. Write your prompt once — model config, inputs, tools, and template in a single markdown file — then run it from Python, TypeScript, C#, or Rust. Treat prompts as code: version them in git, test them in CI, share them across teams, and deploy with confidence.

A minimal `greeting.prompty`:

```
---
name: greeting
model:
  id: gpt-4o-mini
  provider: openai
  connection:
    kind: key
    apiKey: ${env:OPENAI_API_KEY}
inputs:
  - name: userName
    kind: string
    default: World
---
system:
You are a friendly assistant.

user:
Say hello to {{userName}}.
```

Run it from Python:

```python
import prompty

result = prompty.invoke("greeting.prompty", inputs={"userName": "Jane"})
```
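The `${env:OPENAI_API_KEY}` reference in the connection block is resolved from the process environment when the file is loaded. As an illustration only (a pure-Python sketch, not Prompty's actual implementation), resolving `${env:NAME}` placeholders might look like this:

```python
import os
import re

# Matches ${env:VAR_NAME} placeholders in config values.
_ENV_PATTERN = re.compile(r"\$\{env:([A-Za-z_][A-Za-z0-9_]*)\}")

def resolve_env(value: str) -> str:
    """Replace each ${env:NAME} with the value of the NAME environment
    variable, raising if the variable is unset."""
    def _sub(match: re.Match) -> str:
        name = match.group(1)
        if name not in os.environ:
            raise KeyError(f"environment variable {name!r} is not set")
        return os.environ[name]
    return _ENV_PATTERN.sub(_sub, value)

os.environ["OPENAI_API_KEY"] = "sk-test"
print(resolve_env("${env:OPENAI_API_KEY}"))  # sk-test
```

Failing fast on an unset variable surfaces misconfiguration at load time instead of at the first API call.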
- **Four runtimes** — Python, TypeScript, C#, and Rust
- **Pipeline architecture** — render → parse → execute → process, each stage swappable
- **Built-in tracing** — console, JSON file, and OpenTelemetry backends
- **Provider support** — OpenAI, Azure / Foundry, and Anthropic
- **VS Code extension** — syntax highlighting, autocomplete, live preview, and trace viewer
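To make the pipeline idea concrete, here is a minimal, hypothetical sketch of the first two stages: render (substituting `{{name}}` placeholders) and parse (splitting the rendered body into role-tagged messages). The function names and message shape are illustrative assumptions, not Prompty's actual internals:

```python
import re

def render(template: str, inputs: dict) -> str:
    """Render stage: substitute {{name}} placeholders with input values,
    leaving unknown placeholders untouched."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(inputs.get(m.group(1), m.group(0))),
        template,
    )

def parse(rendered: str) -> list[dict]:
    """Parse stage: split 'system:' / 'user:' / 'assistant:' sections
    into a list of chat messages."""
    messages = []
    for line in rendered.splitlines():
        if line.rstrip().lower() in ("system:", "user:", "assistant:"):
            messages.append({"role": line.rstrip()[:-1].lower(), "content": ""})
        elif messages:
            messages[-1]["content"] += line + "\n"
    return [{**m, "content": m["content"].strip()} for m in messages]

body = "system:\nYou are a friendly assistant.\nuser:\nSay hello to {{userName}}."
msgs = parse(render(body, {"userName": "Jane"}))
# msgs == [{"role": "system", "content": "You are a friendly assistant."},
#          {"role": "user", "content": "Say hello to Jane."}]
```

Because each stage takes and returns plain data, any one of them can be swapped out, e.g. a different template engine in the render stage, without touching the others.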

Prompty is open source. We welcome contributions to the runtimes, tooling, and documentation. See the Contributing guide to get started.