Prompty

Agency with observability

Prompty is a markdown file format for LLM prompts — define model config, inputs, tools, and templates in YAML frontmatter, then execute across Python and TypeScript.

One file, complete prompt

A .prompty file pairs YAML frontmatter with a markdown prompt body in a single, portable asset. Define your model, inputs, tools, and template — then execute it from Python, TypeScript, or VS Code with one command.

  • Model configuration and connection details
  • Input schema with types and defaults
  • Tool definitions (function, MCP, OpenAPI)
  • Template rendering with Jinja2 or Mustache
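For illustration, a minimal .prompty file might look like the sketch below. The model name, schema keys, and inputs shown are representative examples; consult the Prompty schema for the authoritative set of fields.

```markdown
---
name: Support Assistant
description: Answers product questions for an outdoor gear store
model:
  api: chat
  configuration:
    type: openai
    name: gpt-4o
  parameters:
    max_tokens: 500
inputs:
  question:
    type: string
    default: What tents do you sell?
---

system:
You are a helpful assistant for an outdoor gear store. Be concise.

user:
{{question}}
```

The YAML frontmatter carries configuration; everything after the closing `---` is the templated prompt body, with `system:` and `user:` markers delimiting roles.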

Author in VS Code

The Prompty VS Code extension is your prompt engineering workbench. Write prompts with full language support, run them with F5, and inspect every detail in the built-in trace viewer.

  • Syntax highlighting and autocomplete
  • Connection management for OpenAI, Anthropic, and Microsoft Foundry
  • Live preview and interactive chat mode
  • One-click execution with inline results

Trace everything

Every execution produces a detailed trace — messages sent, tokens used, latency, and the raw API response. Debug prompt issues in seconds, not hours.

  • Built-in console, JSON file, and OpenTelemetry backends
  • VS Code trace viewer with conversation, timing, and raw views
  • Decorate your own functions with @trace for end-to-end visibility
  • Zero overhead when no tracer is registered
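The zero-overhead idea can be sketched in a few lines of plain Python. This is an illustrative pattern, not Prompty's actual tracer API; names like `register_tracer` are hypothetical.

```python
from functools import wraps

# Registry of tracer callbacks. When it is empty, the decorator's
# only cost is one list check per call.
_tracers = []

def register_tracer(fn):
    """Register a callback receiving (name, inputs, result)."""
    _tracers.append(fn)

def trace(fn):
    """Report each call of fn to every registered tracer."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        for t in _tracers:  # empty registry -> effectively a no-op
            t(fn.__name__, {"args": args, "kwargs": kwargs}, result)
        return result
    return wrapper

@trace
def add(a, b):
    return a + b
```

A console backend is then just a callback that prints, and a JSON-file backend one that appends to a file; the decorated function never changes.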

The pipeline

Prompty processes every prompt through a four-stage pipeline — each stage is swappable and independently traceable.

  • Render — expand template variables with Jinja2 or Mustache
  • Parse — split role markers into a message list
  • Execute — call the LLM via any registered provider
  • Process — extract content, parse structured output, stream chunks
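The four stages above can be sketched as swappable functions. This is a toy illustration in plain Python, not Prompty's internals: the renderer handles only `{{var}}` substitution and the executor accepts any callable as a provider.

```python
import re

def render(template: str, inputs: dict) -> str:
    # Stage 1: minimal Mustache-style {{var}} substitution.
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: str(inputs.get(m.group(1), "")), template)

def parse(text: str) -> list:
    # Stage 2: split role markers into a message list.
    messages, role, buf = [], None, []
    for line in text.splitlines():
        m = re.match(r"^(system|user|assistant):\s*$", line)
        if m:
            if role:
                messages.append({"role": role, "content": "\n".join(buf).strip()})
            role, buf = m.group(1), []
        else:
            buf.append(line)
    if role:
        messages.append({"role": role, "content": "\n".join(buf).strip()})
    return messages

def execute(messages, llm):
    # Stage 3: call the LLM via any registered provider (here, a callable).
    return llm(messages)

def process(raw: dict) -> str:
    # Stage 4: extract content from the raw response.
    return raw["content"]

def run(template, inputs, llm):
    return process(execute(parse(render(template, inputs)), llm))
```

Because each stage is an ordinary function, any one of them can be replaced (a different template engine, a different provider) without touching the others, which is the property the pipeline design is after.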

1. Portable by design

Same .prompty file works across Python, TypeScript, and VS Code — no rewriting.

2. Any provider

OpenAI, Microsoft Foundry, Anthropic — or build your own executor in minutes.

3. Observable by default

Built-in tracing with pluggable backends — console, files, or OpenTelemetry.

Agent mode & tools

Define tools in your frontmatter and let the runtime handle the tool-calling loop automatically. Supports streaming, structured output, and the OpenAI Responses API.

  • Function tools with local callables
  • MCP server integration
  • OpenAPI-based tools
  • Automatic retry and error recovery
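The tool-calling loop the runtime handles can be sketched as follows. This is an illustrative pattern, not Prompty's implementation: the model is any callable that returns either a final answer or a tool-call request, and tools are local callables looked up by name.

```python
def tool_loop(model, tools: dict, messages: list, max_turns: int = 5):
    """Run the model until it stops requesting tools, feeding each
    tool result back into the conversation."""
    for _ in range(max_turns):
        reply = model(messages)
        if "tool_call" not in reply:
            return reply["content"]  # final answer, loop ends
        call = reply["tool_call"]
        result = tools[call["name"]](**call["arguments"])  # local callable
        messages = messages + [
            {"role": "assistant", "tool_call": call},
            {"role": "tool", "name": call["name"], "content": str(result)},
        ]
    raise RuntimeError("tool loop did not converge")
```

Declaring tools in frontmatter means the runtime can build the `tools` mapping for you and run this loop automatically, rather than each caller re-implementing it.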

Open source, open standard

Prompty is built on a simple premise: even as AI systems grow more complex, the prompt remains a fundamental unit of work. The format is open, extensible, and designed to grow with the ecosystem.

  • TypeSpec-defined schema with generated model code
  • Entry-point-based plugin system for custom providers
  • MIT licensed — contributions welcome
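An entry-point-based plugin system in Python means a third-party package declares its provider in its own packaging metadata, and the runtime discovers it at import time. The fragment below is a hypothetical sketch of such a declaration; the entry-point group name and module paths are placeholders, so check the Prompty documentation for the actual group to register under.

```toml
# pyproject.toml of a hypothetical third-party provider package.
# Group name and paths are illustrative placeholders.
[project.entry-points."prompty.invoker"]
my_provider = "my_provider.executor:MyExecutor"
```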