# TypeScript

The TypeScript runtime is split into focused packages:

| Package | Description |
| --- | --- |
| `@prompty/core` | Core runtime: loader, pipeline, types |
| `@prompty/openai` | OpenAI provider (chat, responses, embedding, image) |
| `@prompty/foundry` | Microsoft Foundry provider |
| `@prompty/anthropic` | Anthropic provider |
## Installation

```sh
# Core + OpenAI
npm install @prompty/core @prompty/openai

# Core + Microsoft Foundry
npm install @prompty/core @prompty/foundry

# Core + Anthropic
npm install @prompty/core @prompty/anthropic
```
## Quick start

```ts
import { execute } from "@prompty/core";
import "@prompty/openai"; // registers the OpenAI provider

const result = await execute("greeting.prompty", {
  userName: "Jane",
});

console.log(result);
```
## Loading an agent

```ts
import { load } from "@prompty/core";

const agent = await load("chat.prompty");
console.log(agent.name); // "chat"
console.log(agent.model.id); // "gpt-4o"
```
## Pipeline steps

```ts
import { load, prepare, run, execute } from "@prompty/core";
import "@prompty/openai";

// Step by step
const agent = await load("chat.prompty");
const messages = await prepare(agent, { question: "Hi" });
const result = await run(agent, messages);

// All-in-one
const result2 = await execute("chat.prompty", { question: "Hi" });
```
## Tool calling

```ts
import { executeAgent } from "@prompty/core";
import "@prompty/openai";

function getWeather(city: string): string {
  return `72°F and sunny in ${city}`;
}

const result = await executeAgent(
  "agent.prompty",
  { question: "Weather in Seattle?" },
  { get_weather: getWeather }, // maps tool names to implementations
);
```
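The third argument to `executeAgent` is a plain object mapping tool names (as declared in the `.prompty` file) to functions. A standalone sketch of that dispatch pattern, with a hypothetical `callTool` helper standing in for what the runtime does when the model requests a tool call:

```typescript
type Tool = (...args: string[]) => string;

// Tool names map to plain functions, exactly like the object
// passed to executeAgent above.
const tools: Record<string, Tool> = {
  get_weather: (city: string) => `72°F and sunny in ${city}`,
};

// Hypothetical dispatcher: look the requested tool up by name,
// fail loudly on unknown names, and invoke it with the arguments.
function callTool(name: string, ...args: string[]): string {
  const tool = tools[name];
  if (!tool) throw new Error(`unknown tool: ${name}`);
  return tool(...args);
}

console.log(callTool("get_weather", "Seattle")); // → "72°F and sunny in Seattle"
```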
## Streaming

```ts
import { load, prepare, run, process as processResponse } from "@prompty/core";
import "@prompty/openai";

const agent = await load("chat.prompty");
const messages = await prepare(agent, { question: "Tell me a story" });

// Enable streaming
agent.model.options.additionalProperties = { stream: true };

const response = await run(agent, messages, { raw: true });

// Alias the pipeline's `process` on import so it doesn't shadow
// Node's global `process` used on the next line.
for await (const chunk of processResponse(agent, response)) {
  process.stdout.write(chunk);
}
```

## Providers

Each provider package registers its executors and processors via a side-effect import:

| Package | Provider Key | SDK |
| --- | --- | --- |
| `@prompty/openai` | `openai` | OpenAI Node SDK |
| `@prompty/foundry` | `foundry` | Microsoft Foundry SDK |
| `@prompty/anthropic` | `anthropic` | Anthropic SDK |

## Configuration

Set environment variables directly, or use a `.env` file (loaded with dotenv):

```sh
OPENAI_API_KEY=sk-your-key-here
```

The `${env:VAR}` syntax in `.prompty` files works the same way as in the Python runtime.
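For example, a `.prompty` file can pull the key from the environment in its front matter. This is a hedged sketch: the `connection.api_key` field layout is an assumption for illustration and may differ from your provider's configuration schema:

```yaml
---
name: greeting
model:
  id: gpt-4o
  connection:
    api_key: ${env:OPENAI_API_KEY}  # resolved from the environment at load time
---
system:
You are a friendly assistant.

user:
{{userName}} says hello.
```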