Agency with observability

Prompty is a new asset class and format for LLM prompts that aims to provide observability, understandability, and portability for developers.

Prompty.ai logo of a capital P with cartoon cute eyes and a thick border.

What is Prompty?

Prompty is an asset class and format for LLM prompts that aims to provide observability, understandability, and portability for developers. The primary goal is to speed up the developer inner loop.

Prompty comprises three things:

  • the specification,
  • its tooling,
  • and its runtime.
Example of a Prompty file in VS Code using the Prompty extension.

The specification

Prompty is intended to be a language agnostic asset class for creating and managing prompts.

  • Uses common markdown format
  • Modified front-matter to specify metadata, model settings, sample data (among other things)
  • Content in a standard template format
Example of a Prompty file in VS Code using the Prompty extension.
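
To make the format concrete, here is a minimal, hypothetical .prompty file: front matter for metadata, model settings, and sample data, followed by the templated prompt content. The field names below are illustrative; the exact schema is defined by the Prompty specification.

```markdown
---
name: Basic Chat
description: A hypothetical prompt that answers a user's question
model:
  api: chat
sample:
  question: What can you do for me?
---
system:
You are a helpful assistant. Answer briefly and politely.

user:
{{question}}
```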

The tooling

Given a standard specification, there are a lot of nice things we can give developers in their environment.

  • Front matter autocompletion
  • Colorization / syntax highlighting
  • Validation (with red squiggles for undefined variables)
  • Quick run
  • Code generation
  • Evaluation generation
Prompty tooling example of the Prompty extension in VS Code

The runtime

The Prompty runtime is whatever engine understands and can execute the format. A standalone .prompty file can't really do anything on its own; it needs the extension (when developing) or a runtime (when running).

  • Targeting LangChain, Semantic Kernel, and Prompt Flow as supporting runtimes
  • Works in Python (Prompt Flow and LangChain)
  • Works in C# (Semantic Kernel)
  • (Future Work) works in TypeScript/JavaScript
  • Understood in Azure AI Studio
The Prompty runtime processing a .prompty file in VS Code using the Prompty extension.
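
At its core, a runtime's first job is to parse the file and render the template before handing the result to a model. The following is a toy sketch of those two steps in Python, using a made-up sample file and a simplified parser, not the real Prompty SDK:

```python
import re

# A made-up sample file; field names are illustrative, the real schema
# comes from the Prompty specification.
SAMPLE = """\
---
name: Basic Prompt
model:
  api: chat
sample:
  question: What can you do?
---
system:
You are a helpful assistant.

user:
{{question}}
"""

def split_prompty(text: str):
    """Split a .prompty file into its front matter and template body."""
    match = re.match(r"^---\n(.*?)\n---\n(.*)$", text, re.DOTALL)
    if not match:
        raise ValueError("no front matter found")
    return match.group(1), match.group(2)

def render(template: str, **values) -> str:
    """Minimal {{name}} substitution; real runtimes use a full template engine."""
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: str(values.get(m.group(1), m.group(0))),
        template,
    )

front_matter, body = split_prompty(SAMPLE)
prompt = render(body, question="What can you do?")
```

A real runtime would go on to map the rendered messages onto the model API named in the front matter; this sketch stops at the point where the prompt is ready to send.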
What are the benefits of Prompty?

1. Feel confident while building

Understand what's coming in and going out and how to manage it effectively.
2. Language agnostic

Use with any language or framework you are familiar with.
3. Flexible and simple

Integrate into whatever development environments or workflows you have.

Standards open doors

By working in a common format we open up opportunities for new improvements.
  • By default, every Prompty execution produces tracing for each prompty called
  • Developers can add additional tracing via simple SDK functions
  • Tracing output uses OpenTelemetry, so any existing tooling around that standard can be used to visualize it
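
As an illustration of the kind of tracing an SDK can add, here is a hypothetical decorator (not the real Prompty tracer) that records a span-like record for each call; a real tracer would export these spans via OpenTelemetry instead of keeping them in memory:

```python
import functools
import time

# Collected span-like records; illustrative stand-in for an
# OpenTelemetry exporter.
SPANS = []

def trace(fn):
    """Hypothetical tracing decorator: records the call name,
    duration, inputs, and output as a span-like dict."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        SPANS.append({
            "name": fn.__name__,
            "duration_ms": (time.perf_counter() - start) * 1000,
            "inputs": {"args": args, "kwargs": kwargs},
            "output": result,
        })
        return result
    return wrapper

@trace
def summarize(text: str) -> str:
    # Stand-in for a prompty execution; real spans would wrap model calls.
    return text[:10]

summarize("hello world")
```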

Works for everyone

Prompty is built on the premise that, even as AI grows more complex, the prompt remains a fundamental unit. Understanding this can lead to more innovative developments in AI applications, for everyone.