# Use with Microsoft Foundry
## Prerequisites

- A Microsoft Foundry project or an Azure OpenAI resource with a deployed model (e.g., `gpt-4o-mini`)
- The resource endpoint:
  - Foundry project endpoint (recommended): `https://<resource>.services.ai.azure.com/api/projects/<project>`
  - Classic Azure OpenAI endpoint (legacy): `https://<resource>.openai.azure.com/`
- Either an API key or Azure AD credentials
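Both endpoint shapes are plain HTTPS URLs, so a misconfigured variable is easy to catch before the first request. A minimal sketch of such a check (the `classify_endpoint` helper and its patterns are illustrative, not part of Prompty):

```python
import re

# Illustrative patterns for the two endpoint shapes listed above.
FOUNDRY_RE = re.compile(r"^https://[^./]+\.services\.ai\.azure\.com/api/projects/[^/]+/?$")
CLASSIC_RE = re.compile(r"^https://[^./]+\.openai\.azure\.com/?$")

def classify_endpoint(url: str) -> str:
    """Classify an endpoint as a Foundry project endpoint, a classic
    Azure OpenAI endpoint, or neither."""
    if FOUNDRY_RE.match(url):
        return "foundry-project"
    if CLASSIC_RE.match(url):
        return "classic-azure-openai"
    return "unknown"
```

A check like this at startup turns a cryptic 404 later on into an immediate, readable error.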
Install the required packages:

```sh
# API key auth
pip install "prompty[jinja2,foundry]"

# Azure AD auth (adds azure-identity)
pip install "prompty[jinja2,foundry]" azure-identity
```

```sh
npm install @prompty/core @prompty/foundry
```

## Option A: API Key Authentication

The simplest approach — use an API key from the Azure AI Foundry portal or the Azure Portal.
### 1. Write the .prompty File

Create `foundry-chat.prompty`:

```yaml
---
name: foundry-chat
description: Chat completion with Microsoft Foundry (API key)
model:
  id: gpt-4o-mini
  provider: foundry
  apiType: chat
  connection:
    kind: key
    endpoint: ${env:AZURE_AI_PROJECT_ENDPOINT}
    apiKey: ${env:AZURE_AI_PROJECT_KEY}
  options:
    temperature: 0.7
    maxOutputTokens: 1024
inputSchema:
  properties:
    - name: question
      kind: string
      default: What is Microsoft Foundry?
template:
  format:
    kind: jinja2
  parser:
    kind: prompty
---
system:
You are a helpful assistant. Answer concisely.

user:
{{question}}
```

### 2. Run It
Section titled “2. Run It”import prompty
result = prompty.execute( "foundry-chat.prompty", inputs={"question": "What services does Microsoft Foundry offer?"},)print(result)import { execute } from "@prompty/core";
const result = await execute("foundry-chat.prompty", { inputs: { question: "What services does Microsoft Foundry offer?" },});
console.log(result);3. Environment Setup
```sh
# .env — Foundry project endpoint (recommended)
AZURE_AI_PROJECT_ENDPOINT=https://my-resource.services.ai.azure.com/api/projects/my-project
AZURE_AI_PROJECT_KEY=abc123...

# .env — Classic Azure OpenAI endpoint (legacy, also works)
# AZURE_OPENAI_ENDPOINT=https://my-resource.openai.azure.com/
# AZURE_OPENAI_API_KEY=abc123...
```

## Option B: Azure AD Authentication

For production workloads, use Azure AD (Entra ID) with `DefaultAzureCredential` — no API keys to manage or rotate.
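With this option, the `.prompty` file names a connection rather than embedding credentials; the credentials are attached at runtime when the connection is registered. A rough sketch of that register-then-resolve flow (this is not Prompty's implementation, only an illustration of why the file stays secret-free):

```python
# In-memory registry, standing in for Prompty's named-connection store.
_connections = {}

def register_connection(name, config):
    """Store a connection config (endpoint, credential, ...) under a name."""
    _connections[name] = config

def resolve_connection(connection):
    """Resolve a connection block as parsed from a .prompty file."""
    if connection.get("kind") == "reference":
        # The file carries only a name; credentials live in the registry.
        return _connections[connection["name"]]
    # Inline connections (e.g. kind: key) resolve to themselves.
    return connection

register_connection("foundry_default", {"endpoint": "https://example.invalid/api/projects/demo"})
resolved = resolve_connection({"kind": "reference", "name": "foundry_default"})
```

The same file can therefore run against different endpoints and identities simply by registering a different `foundry_default` per environment.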
### 1. Write the .prompty File

Create `foundry-chat-aad.prompty`:

```yaml
---
name: foundry-chat-aad
description: Chat completion with Microsoft Foundry (Azure AD)
model:
  id: gpt-4o-mini
  provider: foundry
  apiType: chat
  connection:
    kind: reference
    name: foundry_default
  options:
    temperature: 0.7
    maxOutputTokens: 1024
inputSchema:
  properties:
    - name: question
      kind: string
      default: What is Microsoft Foundry?
template:
  format:
    kind: jinja2
  parser:
    kind: prompty
---
system:
You are a helpful assistant. Answer concisely.

user:
{{question}}
```

### 2. Register the Connection and Run
Section titled “2. Register the Connection and Run”import osfrom azure.identity import DefaultAzureCredentialimport prompty
# Register the named connection before loading the promptprompty.register_connection( "foundry_default", { "endpoint": os.environ["AZURE_AI_PROJECT_ENDPOINT"], "credential": DefaultAzureCredential(), },)
result = prompty.execute( "foundry-chat-aad.prompty", inputs={"question": "What is Microsoft Foundry?"},)print(result)import { execute, registerConnection } from "@prompty/core";import { DefaultAzureCredential } from "@azure/identity";
registerConnection("foundry_default", { endpoint: process.env.AZURE_AI_PROJECT_ENDPOINT!, credential: new DefaultAzureCredential(),});
const result = await execute("foundry-chat-aad.prompty", { inputs: { question: "What is Microsoft Foundry?" },});
console.log(result);3. Required Azure RBAC
The identity used by `DefaultAzureCredential` needs the
Cognitive Services OpenAI User role on the resource:

```sh
az role assignment create \
  --assignee <your-identity-object-id> \
  --role "Cognitive Services OpenAI User" \
  --scope /subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.CognitiveServices/accounts/<resource>
```

## Comparing the Two Options

| | API Key (`kind: key`) | Azure AD (`kind: reference`) |
|---|---|---|
| Setup | Copy key from portal | Assign RBAC role |
| Security | Key in env var — can leak | Token-based — no static secret |
| Rotation | Manual | Automatic |
| Best for | Local dev, prototyping | Production, CI/CD |
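Projects that need both modes can select one at startup based on which environment variables are set. A minimal sketch under that assumption (`connection_from_env` is an illustrative helper, not a Prompty API; the connection name `foundry_default` mirrors the examples above):

```python
def connection_from_env(env):
    """Illustrative helper: choose a connection config from whichever
    environment variables are present."""
    api_key = env.get("AZURE_AI_PROJECT_KEY")
    if api_key:
        # Local dev / prototyping: inline key auth.
        return {
            "kind": "key",
            "endpoint": env["AZURE_AI_PROJECT_ENDPOINT"],
            "apiKey": api_key,
        }
    # Production / CI: fall back to a named connection registered at
    # startup with DefaultAzureCredential (see Option B).
    return {"kind": "reference", "name": "foundry_default"}
```

This keeps local runs friction-free while ensuring deployed environments never carry a static key.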
## Using a Shared Connection File

If multiple `.prompty` files share the same Foundry config, extract it into a JSON
file and reference it with `${file:...}`:
`shared/foundry-connection.json`:

```json
{
  "kind": "key",
  "endpoint": "https://my-resource.services.ai.azure.com/api/projects/my-project",
  "apiKey": "${env:AZURE_AI_PROJECT_KEY}"
}
```

`my-prompt.prompty`:

```yaml
---
name: shared-connection-demo
model:
  id: gpt-4o-mini
  provider: foundry
  connection: ${file:shared/foundry-connection.json}
---
system:
You are a helpful assistant.

user:
{{question}}
```

This avoids duplicating endpoint and auth config across prompts.
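Note that the shared file still contains a `${env:...}` placeholder, so no secret is ever written to disk. For intuition, the substitution amounts to something like the following sketch (`expand_env_refs` is an illustrative stand-in, not Prompty's actual resolver):

```python
import os
import re

# Matches ${env:SOME_NAME} placeholders like the apiKey value above.
ENV_REF = re.compile(r"\$\{env:([A-Za-z_][A-Za-z0-9_]*)\}")

def expand_env_refs(value, env=None):
    """Replace each ${env:NAME} placeholder with the variable's value,
    leaving unknown names untouched."""
    env = dict(os.environ) if env is None else env
    return ENV_REF.sub(lambda m: env.get(m.group(1), m.group(0)), value)
```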
## Environment Setup (Both Options)

```sh
AZURE_AI_PROJECT_ENDPOINT=https://my-resource.services.ai.azure.com/api/projects/my-project
AZURE_AI_PROJECT_KEY=abc123... # Only needed for API key auth

# Optional: specify deployment names per model type
AZURE_OPENAI_CHAT_DEPLOYMENT=gpt-4o-mini
AZURE_OPENAI_EMBEDDING_DEPLOYMENT=text-embedding-3-small
```
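If your entry point does not already load the `.env` file, a loader such as python-dotenv's `load_dotenv()` (or `dotenv` for Node) is the usual choice. The format is also simple enough to parse by hand, as in this rough sketch:

```python
def parse_dotenv(text):
    """Rough sketch of a .env parser (use python-dotenv in practice).
    Handles blank lines, comment lines, and trailing inline comments;
    it does not handle quoting or multiline values."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        name, sep, value = line.partition("=")
        if not sep:
            continue
        env[name.strip()] = value.split(" #", 1)[0].strip()
    return env
```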