# Prompt Management

Version-controlled prompt templates with variable substitution, deployment, and playground testing.

Prompt Management lets you create reusable, version-controlled prompt templates that can be deployed to your proxies and tested in an interactive playground.

Every change to a prompt automatically creates a new version, giving you full history and the ability to roll back at any time.
## Core Concepts

### Prompt Templates
Prompts are reusable text templates that can include:
- Content: The main prompt text with optional variable placeholders
- Variables: Dynamic values that are substituted at runtime
- Tags: Labels for organization and filtering
- Description: Documentation for what the prompt does
### Variables

Use double curly braces to define variables in your prompts:

```text
You are a {{role}} assistant helping with {{task}}.
The user's name is {{user_name}}.
Their preferred language is {{language}}.
```

Each variable can have:
- Name: The identifier used in double braces
- Description: Documentation for the variable
- Default Value: Fallback when no value is provided
- Required: Whether the variable must be provided
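The substitution semantics above (supplied value, then default, then a required-variable check) can be sketched as a small rendering function. This is an illustration, not Bastio's actual implementation:

```python
import re

def render_prompt(template, variables, values):
    """Resolve declared variables against supplied values and defaults."""
    resolved = {}
    for var in variables:
        name = var["name"]
        if name in values:
            resolved[name] = values[name]
        elif var.get("default_value") is not None:
            resolved[name] = var["default_value"]
        elif var.get("required"):
            raise ValueError(f"missing required variable: {name}")
    # Substitute each {{name}} placeholder; unknown names are left untouched.
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: resolved.get(m.group(1), m.group(0)),
                  template)

variables = [
    {"name": "role", "default_value": "customer support", "required": True},
    {"name": "task", "default_value": "their inquiry", "required": False},
]
print(render_prompt(
    "You are a {{role}} assistant helping with {{task}}.",
    variables,
    {"role": "billing"},
))
# → You are a billing assistant helping with their inquiry.
```

Here `role` takes the supplied value while `task` falls back to its default.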
## Creating Prompts

### Via Dashboard

1. Navigate to Prompts in your dashboard
2. Click Create Prompt
3. Fill in the name, description, and content
4. Add variables using the variable editor
5. Add tags for organization
6. Click Create
### Via API
```python
import requests

response = requests.post(
    "https://api.bastio.com/api/prompts",
    headers={
        "Authorization": "Bearer YOUR_JWT_TOKEN",
        "Content-Type": "application/json"
    },
    json={
        "name": "Customer Support Assistant",
        "description": "A helpful assistant for customer inquiries",
        "content": "You are a {{role}} assistant. Help the user with {{task}}.",
        "variables": [
            {
                "name": "role",
                "description": "The role of the assistant",
                "default_value": "customer support",
                "required": True
            },
            {
                "name": "task",
                "description": "The task to help with",
                "default_value": "their inquiry",
                "required": False
            }
        ],
        "tags": ["support", "customer-facing"]
    }
)
prompt = response.json()
print(f"Created prompt: {prompt['id']}")
```

```javascript
const response = await fetch('https://api.bastio.com/api/prompts', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer YOUR_JWT_TOKEN',
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    name: 'Customer Support Assistant',
    description: 'A helpful assistant for customer inquiries',
    content: 'You are a {{role}} assistant. Help the user with {{task}}.',
    variables: [
      {
        name: 'role',
        description: 'The role of the assistant',
        default_value: 'customer support',
        required: true
      },
      {
        name: 'task',
        description: 'The task to help with',
        default_value: 'their inquiry',
        required: false
      }
    ],
    tags: ['support', 'customer-facing']
  })
});
const prompt = await response.json();
console.log(`Created prompt: ${prompt.id}`);
```

```bash
curl -X POST https://api.bastio.com/api/prompts \
  -H "Authorization: Bearer YOUR_JWT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Customer Support Assistant",
    "description": "A helpful assistant for customer inquiries",
    "content": "You are a {{role}} assistant. Help the user with {{task}}.",
    "variables": [
      {
        "name": "role",
        "description": "The role of the assistant",
        "default_value": "customer support",
        "required": true
      }
    ],
    "tags": ["support", "customer-facing"]
  }'
```

```go
package main

import (
	"bytes"
	"encoding/json"
	"log"
	"net/http"
)

type Variable struct {
	Name         string `json:"name"`
	Description  string `json:"description"`
	DefaultValue string `json:"default_value"`
	Required     bool   `json:"required"`
}

type CreatePromptRequest struct {
	Name        string     `json:"name"`
	Description string     `json:"description"`
	Content     string     `json:"content"`
	Variables   []Variable `json:"variables"`
	Tags        []string   `json:"tags"`
}

func main() {
	req := CreatePromptRequest{
		Name:        "Customer Support Assistant",
		Description: "A helpful assistant for customer inquiries",
		Content:     "You are a {{role}} assistant. Help the user with {{task}}.",
		Variables: []Variable{
			{Name: "role", Description: "The role", DefaultValue: "customer support", Required: true},
		},
		Tags: []string{"support", "customer-facing"},
	}
	body, err := json.Marshal(req)
	if err != nil {
		log.Fatal(err)
	}
	request, err := http.NewRequest("POST", "https://api.bastio.com/api/prompts", bytes.NewBuffer(body))
	if err != nil {
		log.Fatal(err)
	}
	request.Header.Set("Authorization", "Bearer YOUR_JWT_TOKEN")
	request.Header.Set("Content-Type", "application/json")
	resp, err := http.DefaultClient.Do(request)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
}
```

## Version Control
### How Versioning Works
Every time you update a prompt's content or variables, a new version is automatically created:
- Automatic: No manual version creation needed
- Non-destructive: Previous versions are preserved
- Change Summaries: Add notes about what changed
- Instant Rollback: Restore any previous version
### Viewing Version History
In the prompt detail page, click History to see:
- All previous versions with timestamps
- Who made each change
- Change summaries
- Full content preview
### Rolling Back

To restore a previous version:

1. Open the version history
2. Select the version to restore
3. Click Rollback
4. A new version is created with the old content

Rollback is non-destructive: it creates a new version with the old content rather than deleting history.
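The version semantics described above can be modeled with a small append-only history. This sketch is purely illustrative of the behavior, not how Bastio stores versions:

```python
class PromptHistory:
    """Append-only version list; version numbers are 1-based."""

    def __init__(self, content):
        self.versions = [content]

    def update(self, content):
        # Every edit automatically becomes a new version.
        self.versions.append(content)

    def rollback(self, version):
        # Non-destructive: the old content is copied forward as a new
        # version instead of truncating the history.
        self.versions.append(self.versions[version - 1])

    @property
    def current(self):
        return self.versions[-1]

history = PromptHistory("You are a helpful assistant.")
history.update("You are a terse assistant.")
history.rollback(1)
print(history.current)        # → You are a helpful assistant.
print(len(history.versions))  # → 3 (nothing was deleted)
```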
### Comparing Versions
Use the version comparison tool to see:
- Side-by-side content differences
- Performance metrics comparison
- Change indicators for latency, cost, and score
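For a rough local analogue of the content comparison, Python's standard `difflib` produces a unified diff of two versions' content (the dashboard's metrics comparison has no local equivalent):

```python
import difflib

v1 = """You are a helpful assistant.
Answer briefly."""
v2 = """You are a helpful assistant.
Answer in detail, citing sources."""

# Unified diff of the two versions, line by line.
diff = list(difflib.unified_diff(
    v1.splitlines(), v2.splitlines(),
    fromfile="version 1", tofile="version 2", lineterm="",
))
print("\n".join(diff))
```

Unchanged lines appear with a leading space, removals with `-`, and additions with `+`.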
## Deploying to Proxies
Deploy prompts to proxies to have them injected into every request.
### Deployment Options
| Option | Description |
|---|---|
| Injection Position | Where to inject: prepend, append, or replace_system |
| Priority | Order of injection when multiple prompts are deployed (higher = first) |
| Variable Values | Runtime values for variables |
| Active/Inactive | Enable or disable the deployment |
### Injection Positions
- Prepend: Add before the existing system message
- Append: Add after the existing system message
- Replace System: Completely replace the system message
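A sketch of how the three positions transform a request's message list (illustrative logic only, not the proxy's actual code):

```python
def inject(messages, prompt_text, position):
    """Return messages with prompt_text injected per the chosen position."""
    system = next((m for m in messages if m["role"] == "system"), None)
    rest = [m for m in messages if m["role"] != "system"]
    if position == "replace_system" or system is None:
        content = prompt_text
    elif position == "prepend":
        content = prompt_text + "\n" + system["content"]
    elif position == "append":
        content = system["content"] + "\n" + prompt_text
    else:
        raise ValueError(f"unknown position: {position}")
    return [{"role": "system", "content": content}] + rest

messages = [
    {"role": "system", "content": "Be concise."},
    {"role": "user", "content": "Hi"},
]
for position in ("prepend", "append", "replace_system"):
    print(position, "->", inject(messages, "You are a support assistant.", position)[0])
```

Note that if the request has no system message, all three positions reduce to inserting the prompt as the system message.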
### Deploying via Dashboard

1. Open the prompt detail page
2. Go to the Deployments tab
3. Click Deploy to Proxy
4. Select the target proxy
5. Choose injection position and priority
6. Provide variable values
7. Click Deploy
### Deploying via API

```python
import requests

response = requests.post(
    "https://api.bastio.com/api/proxy-prompts/PROXY_ID/deploy",
    headers={
        "Authorization": "Bearer YOUR_JWT_TOKEN",
        "Content-Type": "application/json"
    },
    json={
        "prompt_id": "PROMPT_ID",
        "injection_position": "prepend",
        "priority": 100,
        "variable_values": {
            "role": "technical support",
            "task": "code debugging"
        }
    }
)
```

## Playground Testing
The interactive playground lets you test prompts across different models.
### Features
- Multi-Model: Test against GPT-4o, Claude 3.5, and other models
- Settings: Adjust temperature, max tokens, and top-p
- Sessions: Save experiments to revisit later
- Metrics: See token usage, latency, and cost per execution
### Using the Playground

1. Navigate to Prompts > Playground
2. Enter or load a prompt
3. Select a model
4. Adjust settings as needed
5. Type a message and click Send
6. View the response with usage metrics
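The same execution is also exposed over the API at `POST /api/playground/execute`. The field names in this request body are assumptions based on the settings listed above, not a confirmed schema:

```python
import json

# Hypothetical request body for POST /api/playground/execute;
# field names are illustrative, not a documented schema.
payload = {
    "content": "You are a {{role}} assistant.",
    "variable_values": {"role": "sales"},
    "model": "gpt-4o",
    "temperature": 0.7,
    "max_tokens": 512,
    "top_p": 1.0,
}
print(json.dumps(payload, indent=2))
# Send it with your usual HTTP client, e.g.:
# requests.post("https://api.bastio.com/api/playground/execute",
#               headers={"Authorization": "Bearer YOUR_JWT_TOKEN"},
#               json=payload)
```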
### Session Saving
Save playground sessions to:
- Revisit experiments later
- Share with team members
- Compare results over time
## Performance Analytics
Track how your prompts perform in production.
### Metrics Tracked
| Metric | Description |
|---|---|
| Traces | Number of times the prompt was used |
| Latency | Average response time |
| Tokens | Average input/output token usage |
| Cost | Total cost based on token usage |
| Score | Average quality score from evaluations |
| Error Rate | Percentage of failed requests |
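To make the table concrete, here is how those aggregates fall out of raw per-request records. The record fields below are illustrative sample data, not Bastio's export schema:

```python
# Illustrative per-request trace records.
traces = [
    {"latency_ms": 820, "tokens_in": 310, "tokens_out": 95, "cost": 0.0041, "error": False},
    {"latency_ms": 1140, "tokens_in": 290, "tokens_out": 130, "cost": 0.0052, "error": False},
    {"latency_ms": 300, "tokens_in": 305, "tokens_out": 0, "cost": 0.0, "error": True},
]

n = len(traces)
summary = {
    "traces": n,                                                    # usage count
    "avg_latency_ms": sum(t["latency_ms"] for t in traces) / n,     # average response time
    "avg_tokens_in": sum(t["tokens_in"] for t in traces) / n,       # average input tokens
    "total_cost": sum(t["cost"] for t in traces),                   # total spend
    "error_rate": sum(t["error"] for t in traces) / n,              # share of failed requests
}
print(summary)
```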
### Viewing Analytics

1. Open the prompt detail page
2. Go to the Performance tab
3. Select a time range (7, 14, or 30 days)
4. View summary cards and trend charts
5. Export data to CSV
## API Reference
| Endpoint | Method | Description |
|---|---|---|
| `/api/prompts` | POST | Create a new prompt |
| `/api/prompts/list` | POST | List all prompts (with search/filter) |
| `/api/prompts/:id` | GET | Get a single prompt |
| `/api/prompts/:id` | PUT | Update a prompt |
| `/api/prompts/:id` | DELETE | Delete a prompt |
| `/api/prompts/:id/versions` | GET | Get version history |
| `/api/prompts/rollback` | POST | Roll back to a version |
| `/api/prompts/compare` | POST | Compare two versions |
| `/api/proxy-prompts/:proxyID/deploy` | POST | Deploy prompt to proxy |
| `/api/proxy-prompts/:proxyID/:promptID` | DELETE | Undeploy from proxy |
| `/api/proxy-prompts/:proxyID` | GET | List proxy deployments |
| `/api/playground/execute` | POST | Execute in playground |
| `/api/playground/sessions` | POST | Save playground session |
## Tier Limits
| Limit | Free | Starter | Pro | Enterprise |
|---|---|---|---|---|
| Prompt Templates | 3 | 25 | 100 | Unlimited |
| Playground/Day | 10 | 100 | 1,000 | Unlimited |
| Version History | Full | Full | Full | Full |
| Deployments | Unlimited | Unlimited | Unlimited | Unlimited |
## Best Practices

### Organize with Tags
```text
# Good tag examples
- customer-support
- onboarding
- technical-help
- sales-assistant
- internal-tools
```

### Write Clear Variable Descriptions
```json
{
  "name": "company_name",
  "description": "The name of the customer's company for personalization",
  "default_value": "your company",
  "required": false
}
```

### Use Change Summaries
When updating prompts, always add a change summary:
- "Added tone instruction for friendlier responses"
- "Fixed formatting issue in code examples"
- "Optimized for shorter responses to reduce cost"
## Next Steps
- Observability - Track prompt performance in production
- Sessions - Group related requests for analysis
- API Reference - Complete API documentation