Manage, Version, and Deploy Your AI Prompts
Create reusable prompt templates with full version control, deploy them to proxies with variable substitution, and test them in an interactive playground.
Full History
Complete version control with rollback
Deploy
Inject prompts into any proxy
Playground
Test prompts across models
Analytics
Per-version performance metrics
Core Capabilities
A complete prompt engineering workflow from creation to deployment and optimization.
Prompt Templates
Create reusable templates with variables, tags, and descriptions for organized prompt management.
- Rich text editor
- Variable definitions
- Tag organization
- Search and filter
Version Control
Every change creates a new version with full history, change summaries, and instant rollback.
- Automatic versioning
- Change summaries
- Instant rollback
- Version comparison
Proxy Deployment
Deploy prompts to any proxy with flexible injection positions and runtime variable substitution.
- Prepend/append/replace
- Priority ordering
- Variable substitution
- Multi-proxy deploy
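To illustrate how prepend/append/replace positions and priority ordering could combine, here is a minimal Python sketch. It is not Bastio's implementation; the `InjectedPrompt` type and field names are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class InjectedPrompt:
    content: str
    position: str  # "prepend" | "append" | "replace" (assumed names)
    priority: int  # lower priority values are applied first

def apply_injections(system_prompt: str, injections: list[InjectedPrompt]) -> str:
    """Apply deployed prompts to a proxy's system prompt in priority order."""
    for inj in sorted(injections, key=lambda i: i.priority):
        if inj.position == "prepend":
            system_prompt = inj.content + "\n" + system_prompt
        elif inj.position == "append":
            system_prompt = system_prompt + "\n" + inj.content
        elif inj.position == "replace":
            system_prompt = inj.content
    return system_prompt
```

Because injections are applied in priority order, a later `replace` overrides everything applied before it, while `prepend` and `append` compose.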
Dynamic Variables
Use variables to create flexible, reusable prompts that adapt to different contexts at runtime.
Variable Syntax
Define variables using double curly braces. Each variable can have a name, description, default value, and required flag.
```
You are a {{role}} assistant helping with {{task}}.
The user's name is {{user_name}}.
Their preferred language is {{language:English}}.
Focus on being {{tone:professional}} and helpful.
```
Runtime Substitution
When deployed to a proxy, variables are substituted with values provided in the deployment configuration or at runtime.
```json
{
  "variable_values": {
    "role": "customer support",
    "task": "order inquiries",
    "user_name": "John",
    "tone": "friendly"
  }
}
```
Default Values
Set fallback values for optional variables
Descriptions
Document what each variable is for
Required Flags
Mark variables that must be provided
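The substitution rules above can be sketched in a few lines of Python. This is an illustration of the `{{name}}` / `{{name:default}}` syntax shown earlier, not the actual Bastio renderer: variables with a value use it, variables with only a default fall back to it, and anything else is treated as required.

```python
import re

# Matches {{name}} or {{name:default}} as shown in the syntax example above.
VAR_PATTERN = re.compile(r"\{\{(\w+)(?::([^}]*))?\}\}")

def render(template: str, values: dict[str, str]) -> str:
    def replace(match: re.Match) -> str:
        name, default = match.group(1), match.group(2)
        if name in values:
            return values[name]
        if default is not None:
            return default  # optional variable: use its declared default
        raise KeyError(f"required variable '{name}' not provided")
    return VAR_PATTERN.sub(replace, template)

template = "You are a {{role}} assistant. Language: {{language:English}}."
print(render(template, {"role": "customer support"}))
# → You are a customer support assistant. Language: English.
```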
Interactive Playground
Test prompts in real-time across different models. Experiment with settings, save sessions, and iterate quickly.
Test the same prompt across GPT-4o, Claude 3.5, and other models to compare outputs and find the best fit for your use case.
Fine-tune temperature, max tokens, and top-p parameters to see how they affect output quality and creativity.
Session Management
Save playground sessions to revisit experiments later, share with team members, or compare results over time.
Performance Analytics
Track how each version of your prompt performs in production. Compare metrics and optimize with data.
Usage Metrics
Track trace counts, token usage, and request volume for each prompt version over time.
Latency Tracking
Monitor average response times to identify slow prompts and optimize for speed.
Cost Analysis
See exact costs per version based on token usage and model pricing.
Quality Scores
Track average quality scores from evaluations to measure prompt effectiveness.
Error Rates
Identify problematic versions with high error rates and debug issues quickly.
CSV Export
Export performance data for custom analysis, reporting, or compliance documentation.
Version Comparison
Compare any two versions side-by-side to understand how changes affect performance.
Content Diff
See exactly what changed between versions with highlighted differences in prompt content and variables.
Metrics Comparison
Compare latency, cost, score, error rate, and token usage between versions with change indicators.
AI-Powered Insights
Get automatically generated summaries that identify whether Version B is uniformly better, or highlight specific trade-offs between metrics like cost vs. quality.
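The change indicators above amount to a percent delta per metric. A generic sketch, assuming simple per-version metric dictionaries (the metric names mirror the list above; nothing here is Bastio-specific):

```python
def compare_versions(a: dict[str, float], b: dict[str, float]) -> dict[str, str]:
    """Percent change from version A to version B for each shared metric."""
    deltas = {}
    for metric, baseline in a.items():
        if baseline == 0:
            deltas[metric] = "n/a"  # avoid dividing by a zero baseline
            continue
        pct = (b[metric] - baseline) / baseline * 100
        deltas[metric] = f"{pct:+.1f}%"
    return deltas

v1 = {"latency_ms": 820, "cost_usd": 0.012, "error_rate": 0.04}
v2 = {"latency_ms": 640, "cost_usd": 0.010, "error_rate": 0.01}
print(compare_versions(v1, v2))
```

A uniformly negative delta on latency, cost, and error rate (with score holding or rising) is the "Version B is uniformly better" case; mixed signs are the cost-vs-quality trade-offs the summaries call out.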
Available on All Plans
Prompt management is included with all Bastio plans. Higher tiers unlock more prompts and playground executions.
| Feature | Free | Starter | Pro | Enterprise |
|---|---|---|---|---|
| Prompt Templates | 3 | 25 | 100 | Unlimited |
| Playground Executions/Day | 10 | 100 | 1,000 | Unlimited |
| Version History | ✓ | ✓ | ✓ | ✓ |
| Proxy Deployment | ✓ | ✓ | ✓ | ✓ |
| Performance Analytics | ✓ | ✓ | ✓ | ✓ |
| Version Comparison | ✓ | ✓ | ✓ | ✓ |
Start Managing Your Prompts
Create version-controlled prompt templates, deploy them to your proxies, and optimize with data-driven insights.
Questions about prompt management? Contact us for a free consultation.