
CLI Usage

The Prompty CLI provides a convenient way to execute prompts, debug issues, and integrate Prompty into scripts and CI/CD pipelines.

The CLI is automatically installed with the Prompty Python package:

pip install "prompty[azure]"

Verify the installation:

prompty --version

Run a prompty file with the basic command:

prompty -s path/to/your/prompt.prompty

Load environment variables from a file:

prompty -s prompt.prompty -e .env

Example .env file:

AZURE_OPENAI_ENDPOINT=https://your-endpoint.openai.azure.com/
AZURE_OPENAI_API_KEY=your-api-key
AZURE_OPENAI_DEPLOYMENT=gpt-35-turbo

Pass input variables using JSON:

prompty -s prompt.prompty --inputs '{"name": "Alice", "topic": "AI"}'

Or from a JSON file:

inputs.json
{
  "customer_name": "John Doe",
  "question": "What are your business hours?"
}

prompty -s prompt.prompty --inputs-file inputs.json
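Hand-quoting JSON for --inputs is a common source of shell-escaping errors. When driving the CLI from a script, the payload can be built with json.dumps instead; a minimal sketch (the file and input names are just the examples above):

```python
import json
import shlex

# Build the --inputs payload programmatically rather than hand-quoting it.
inputs = {"customer_name": "John Doe", "question": "What are your business hours?"}
payload = json.dumps(inputs)  # guaranteed-valid JSON in a single argument

# Passing the command as an argv list avoids shell-quoting pitfalls entirely.
cmd = ["prompty", "-s", "prompt.prompty", "--inputs", payload]
print(shlex.join(cmd))  # the equivalent shell command, correctly quoted
```

Handing `cmd` to subprocess.run (without `shell=True`) sidesteps the invalid-JSON errors described in the troubleshooting section below.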

Override model configuration from the command line:

prompty -s prompt.prompty \
  --config '{"type": "azure_openai", "azure_deployment": "gpt-4"}' \
  -e .env

The CLI includes tracing by default. Control trace output:

# Basic tracing (default)
prompty -s prompt.prompty -e .env
# Verbose tracing
prompty -s prompt.prompty -e .env --verbose
# Save traces to file
prompty -s prompt.prompty -e .env --trace-dir ./traces

Enable streaming for real-time output:

prompty -s prompt.prompty --stream -e .env

Use the CLI in interactive chat mode for multi-turn conversations:

prompty -s chat_prompt.prompty --chat -e .env

In chat mode:

  • Type your messages and press Enter
  • Use /exit to quit
  • Use /clear to clear conversation history
  • Use /help for available commands

Option         Short   Description                   Example
--source       -s      Path to prompty file          -s prompt.prompty
--env          -e      Environment file path         -e .env
--inputs       -i      JSON input variables          -i '{"name": "Alice"}'
--inputs-file          Input variables from file     --inputs-file inputs.json
--config       -c      Model configuration JSON      -c '{"temperature": 0.7}'
--connection           Connection name               --connection production
--stream               Enable streaming output       --stream
--chat                 Interactive chat mode         --chat
--verbose      -v      Verbose output                -v
--trace-dir            Directory for trace files     --trace-dir ./traces
--output       -o      Output file path              -o result.txt
--format       -f      Output format (json, text)    -f json
--help         -h      Show help message             -h
--version              Show version                  --version
Default text output:

prompty -s prompt.prompty -e .env
# Output: Hello! How can I help you today?

Request JSON output instead:

prompty -s prompt.prompty -e .env --format json

JSON output includes metadata:

{
  "content": "Hello! How can I help you today?",
  "usage": {
    "prompt_tokens": 45,
    "completion_tokens": 12,
    "total_tokens": 57
  },
  "model": "gpt-35-turbo",
  "finish_reason": "stop"
}
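Because --format json emits a machine-readable result, downstream scripts can parse it directly. A sketch using the shape shown above (the field names are taken from that example and may vary between versions):

```python
import json

# Sample CLI output in the shape shown above.
raw = """{
  "content": "Hello! How can I help you today?",
  "usage": {"prompt_tokens": 45, "completion_tokens": 12, "total_tokens": 57},
  "model": "gpt-35-turbo",
  "finish_reason": "stop"
}"""

result = json.loads(raw)
# Pull out the reply text and token accounting, e.g. for logging or cost tracking.
content = result["content"]
total_tokens = result["usage"]["total_tokens"]
print(f"{total_tokens} tokens -> {content}")
```

The same parsing works on a file written with -o, e.g. `json.loads(Path("result.json").read_text())`.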
Write the response to a file:

prompty -s prompt.prompty -e .env -o response.txt
Azure OpenAI example:

# Set up environment
export AZURE_OPENAI_ENDPOINT="https://your-endpoint.openai.azure.com/"
export AZURE_OPENAI_API_KEY="your-api-key"
export AZURE_OPENAI_DEPLOYMENT="gpt-35-turbo"
# Execute
prompty -s prompt.prompty -e .env
OpenAI example:

# Set up environment
export OPENAI_API_KEY="sk-your-api-key"
# Execute with OpenAI configuration
prompty -s prompt.prompty \
  --config '{"type": "openai", "model": "gpt-3.5-turbo"}' \
  -e .env
Serverless (GitHub Models) example:

export GITHUB_TOKEN="your-github-token"
prompty -s prompt.prompty \
  --config '{"type": "serverless", "endpoint": "https://models.inference.ai.azure.com", "model": "gpt-4o-mini"}' \
  -e .env

File not found:

prompty -s nonexistent.prompty
# Error: File 'nonexistent.prompty' not found

Invalid JSON inputs:

prompty -s prompt.prompty --inputs '{"name": "Alice"'
# Error: Invalid JSON in inputs

Missing environment variables:

prompty -s prompt.prompty -e .env --verbose
# Will show which environment variables are missing
For deeper debugging, run with verbose output:

prompty -s prompt.prompty -e .env --verbose

Verbose output includes:

  • Environment variable loading
  • Prompt parsing details
  • Model configuration
  • Request/response details
  • Execution timing

Process multiple prompts:

process_prompts.sh
#!/bin/bash
mkdir -p results
for prompt in prompts/*.prompty; do
  echo "Processing $prompt..."
  prompty -s "$prompt" -e .env -o "results/$(basename "$prompt" .prompty).txt"
done

Use in GitHub Actions:

.github/workflows/test-prompts.yml
name: Test Prompts
on: [push]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: '3.11'
      - name: Install Prompty
        run: pip install "prompty[azure]"
      - name: Test prompts
        env:
          AZURE_OPENAI_ENDPOINT: ${{ secrets.AZURE_OPENAI_ENDPOINT }}
          AZURE_OPENAI_API_KEY: ${{ secrets.AZURE_OPENAI_API_KEY }}
        run: |
          mkdir -p results
          for prompt in tests/*.prompty; do
            prompty -s "$prompt" --format json -o "results/$(basename "$prompt" .prompty).json"
          done
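Once the workflow writes one JSON result per prompt, a small follow-up step can fail the build on bad responses. A sketch assuming the JSON output shape shown earlier (check_result and the results/ path are illustrative, not part of the CLI):

```python
import json
from pathlib import Path

def check_result(data: dict) -> bool:
    """True if a CLI JSON result finished cleanly and produced content."""
    return data.get("finish_reason") == "stop" and bool(data.get("content"))

# Scan the results/ directory written by the CI loop (empty list if it is absent).
failures = [p.name for p in Path("results").glob("*.json")
            if not check_result(json.loads(p.read_text()))]
print(f"{len(failures)} failing prompt(s)")
```

In CI, the script could end with `raise SystemExit(1 if failures else 0)` so a truncated or empty response fails the job.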

The CLI returns appropriate exit codes for scripting:

  • 0: Success
  • 1: General error (file not found, invalid JSON, etc.)
  • 2: Configuration error
  • 3: Authentication error
  • 4: API error
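In scripts, these codes can drive retry or alerting logic; from Python, a thin wrapper can translate them into messages. A sketch (the code-to-message mapping mirrors the list above; run_prompty is a hypothetical helper that requires prompty on PATH to actually execute):

```python
import subprocess

# Exit-code meanings, mirroring the list above.
EXIT_MESSAGES = {
    0: "success",
    1: "general error",
    2: "configuration error",
    3: "authentication error",
    4: "API error",
}

def run_prompty(args: list[str]) -> tuple[int, str]:
    """Run the prompty CLI and translate its exit code into a message."""
    proc = subprocess.run(["prompty", *args])
    return proc.returncode, EXIT_MESSAGES.get(proc.returncode, "unknown error")
```

A caller might, for example, retry only on code 4 (API error) and abort immediately on codes 2 and 3, since configuration and authentication problems will not fix themselves.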

For multiple prompts, consider connection reuse:

# Instead of multiple CLI calls, use a Python script
python batch_process.py

batch_process.py
import prompty
import prompty.azure

prompts = ["prompt1.prompty", "prompt2.prompty", "prompt3.prompty"]
for prompt_file in prompts:
    result = prompty.execute(prompt_file)
    print(f"{prompt_file}: {result}")

For large responses, use file output instead of console:

prompty -s large_prompt.prompty -e .env -o large_response.txt

Example use cases:

# Interactive customer support
prompty -s customer_support.prompty --chat -e .env
# Summarize with custom inputs
prompty -s summarize.prompty \
  --inputs '{"document": "path/to/document.txt", "max_length": 200}' \
  -e .env \
  -o summary.txt

# Review code changes
prompty -s code_review.prompty \
  --inputs-file review_context.json \
  --format json \
  -o review_results.json \
  -e .env

Want to contribute to the project? Updated guidance coming soon.