Prompt Management

Version-controlled prompt templates with variable substitution, deployment, and playground testing.

Prompt Management lets you create reusable, version-controlled prompt templates that can be deployed to your proxies and tested in an interactive playground.

Every change to a prompt automatically creates a new version, giving you full history and the ability to roll back at any time.

Core Concepts

Prompt Templates

Prompts are reusable text templates that can include:

  • Content: The main prompt text with optional variable placeholders
  • Variables: Dynamic values that are substituted at runtime
  • Tags: Labels for organization and filtering
  • Description: Documentation for what the prompt does

Variables

Use double curly braces to define variables in your prompts:

You are a {{role}} assistant helping with {{task}}.

The user's name is {{user_name}}.
Their preferred language is {{language}}.

Each variable can have:

  • Name: The identifier used in double braces
  • Description: Documentation for the variable
  • Default Value: Fallback when no value is provided
  • Required: Whether the variable must be provided
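
Substitution can be sketched with a small helper: supplied values win, then defaults, and a missing required variable is an error. This is a sketch of the behavior, assuming simple {{name}} placeholders; the server-side semantics may differ:

```python
import re

def render(template: str, variables: list[dict], values: dict) -> str:
    """Substitute {{name}} placeholders: supplied values win,
    then defaults; missing required variables raise an error."""
    resolved = {}
    for var in variables:
        name = var["name"]
        if name in values:
            resolved[name] = values[name]
        elif var.get("default_value") is not None:
            resolved[name] = var["default_value"]
        elif var.get("required"):
            raise ValueError(f"missing required variable: {name}")
    # Leave unknown placeholders untouched rather than failing
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: resolved.get(m.group(1), m.group(0)),
                  template)
```

For example, render("You are a {{role}} assistant.", [{"name": "role", "default_value": "support"}], {}) falls back to the default and returns "You are a support assistant.".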

Creating Prompts

Via Dashboard

  1. Navigate to Prompts in your dashboard
  2. Click Create Prompt
  3. Fill in the name, description, and content
  4. Add variables using the variable editor
  5. Add tags for organization
  6. Click Create

Via API

Python

import requests

response = requests.post(
    "https://api.bastio.com/api/prompts",
    headers={
        "Authorization": "Bearer YOUR_JWT_TOKEN",
        "Content-Type": "application/json"
    },
    json={
        "name": "Customer Support Assistant",
        "description": "A helpful assistant for customer inquiries",
        "content": "You are a {{role}} assistant. Help the user with {{task}}.",
        "variables": [
            {
                "name": "role",
                "description": "The role of the assistant",
                "default_value": "customer support",
                "required": True
            },
            {
                "name": "task",
                "description": "The task to help with",
                "default_value": "their inquiry",
                "required": False
            }
        ],
        "tags": ["support", "customer-facing"]
    }
)

prompt = response.json()
print(f"Created prompt: {prompt['id']}")

JavaScript

const response = await fetch('https://api.bastio.com/api/prompts', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer YOUR_JWT_TOKEN',
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    name: 'Customer Support Assistant',
    description: 'A helpful assistant for customer inquiries',
    content: 'You are a {{role}} assistant. Help the user with {{task}}.',
    variables: [
      {
        name: 'role',
        description: 'The role of the assistant',
        default_value: 'customer support',
        required: true
      },
      {
        name: 'task',
        description: 'The task to help with',
        default_value: 'their inquiry',
        required: false
      }
    ],
    tags: ['support', 'customer-facing']
  })
});

const prompt = await response.json();
console.log(`Created prompt: ${prompt.id}`);

cURL

curl -X POST https://api.bastio.com/api/prompts \
  -H "Authorization: Bearer YOUR_JWT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Customer Support Assistant",
    "description": "A helpful assistant for customer inquiries",
    "content": "You are a {{role}} assistant. Help the user with {{task}}.",
    "variables": [
      {
        "name": "role",
        "description": "The role of the assistant",
        "default_value": "customer support",
        "required": true
      }
    ],
    "tags": ["support", "customer-facing"]
  }'

Go

package main

import (
    "bytes"
    "encoding/json"
    "net/http"
)

type Variable struct {
    Name         string `json:"name"`
    Description  string `json:"description"`
    DefaultValue string `json:"default_value"`
    Required     bool   `json:"required"`
}

type CreatePromptRequest struct {
    Name        string     `json:"name"`
    Description string     `json:"description"`
    Content     string     `json:"content"`
    Variables   []Variable `json:"variables"`
    Tags        []string   `json:"tags"`
}

func main() {
    req := CreatePromptRequest{
        Name:        "Customer Support Assistant",
        Description: "A helpful assistant for customer inquiries",
        Content:     "You are a {{role}} assistant. Help the user with {{task}}.",
        Variables: []Variable{
            {Name: "role", Description: "The role", DefaultValue: "customer support", Required: true},
        },
        Tags: []string{"support", "customer-facing"},
    }

    body, err := json.Marshal(req)
    if err != nil {
        panic(err)
    }

    request, err := http.NewRequest("POST", "https://api.bastio.com/api/prompts", bytes.NewBuffer(body))
    if err != nil {
        panic(err)
    }
    request.Header.Set("Authorization", "Bearer YOUR_JWT_TOKEN")
    request.Header.Set("Content-Type", "application/json")

    client := &http.Client{}
    resp, err := client.Do(request)
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()
}

Version Control

How Versioning Works

Every time you update a prompt's content or variables, a new version is automatically created:

  1. Automatic: No manual version creation needed
  2. Non-destructive: Previous versions are preserved
  3. Change Summaries: Add notes about what changed
  4. Instant Rollback: Restore any previous version

Viewing Version History

In the prompt detail page, click History to see:

  • All previous versions with timestamps
  • Who made each change
  • Change summaries
  • Full content preview

Rolling Back

To restore a previous version:

  1. Open the version history
  2. Select the version to restore
  3. Click Rollback
  4. A new version is created with the old content

Rollback is non-destructive: it creates a new version with the old content rather than deleting history.

Comparing Versions

Use the version comparison tool to see:

  • Side-by-side content differences
  • Performance metrics comparison
  • Change indicators for latency, cost, and score

Deploying to Proxies

Deploy prompts to proxies to have them injected into every request.

Deployment Options

Option | Description
Injection Position | Where to inject: prepend, append, or replace_system
Priority | Order of injection when multiple prompts are deployed (higher = first)
Variable Values | Runtime values for variables
Active/Inactive | Enable or disable the deployment

Injection Positions

  • Prepend: Add before the existing system message
  • Append: Add after the existing system message
  • Replace System: Completely replace the system message

Deploying via Dashboard

  1. Open the prompt detail page
  2. Go to the Deployments tab
  3. Click Deploy to Proxy
  4. Select the target proxy
  5. Choose injection position and priority
  6. Provide variable values
  7. Click Deploy

Deploying via API

import requests

response = requests.post(
    "https://api.bastio.com/api/proxy-prompts/PROXY_ID/deploy",
    headers={
        "Authorization": "Bearer YOUR_JWT_TOKEN",
        "Content-Type": "application/json"
    },
    json={
        "prompt_id": "PROMPT_ID",
        "injection_position": "prepend",
        "priority": 100,
        "variable_values": {
            "role": "technical support",
            "task": "code debugging"
        }
    }
)

Playground Testing

The interactive playground lets you test prompts across different models.

Features

  • Multi-Model: Test against GPT-4o, Claude 3.5, and other models
  • Settings: Adjust temperature, max tokens, and top-p
  • Sessions: Save experiments to revisit later
  • Metrics: See token usage, latency, and cost per execution

Using the Playground

  1. Navigate to Prompts > Playground
  2. Enter or load a prompt
  3. Select a model
  4. Adjust settings as needed
  5. Type a message and click Send
  6. View the response with usage metrics

Session Saving

Save playground sessions to:

  • Revisit experiments later
  • Share with team members
  • Compare results over time

Performance Analytics

Track how your prompts perform in production.

Metrics Tracked

Metric | Description
Traces | Number of times the prompt was used
Latency | Average response time
Tokens | Average input/output token usage
Cost | Total cost based on token usage
Score | Average quality score from evaluations
Error Rate | Percentage of failed requests
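
The aggregations behind these metrics are straightforward: error rate is failed traces over total traces, and latency is a simple mean. A sketch over raw trace records, where the field names are assumptions:

```python
def summarize(traces: list[dict]) -> dict:
    """Aggregate per-trace records into summary metrics."""
    if not traces:
        return {"traces": 0, "avg_latency_ms": 0.0, "error_rate": 0.0}
    failed = sum(1 for t in traces if t.get("error"))
    return {
        "traces": len(traces),
        "avg_latency_ms": sum(t["latency_ms"] for t in traces) / len(traces),
        "error_rate": failed / len(traces),
    }
```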

Viewing Analytics

  1. Open the prompt detail page
  2. Go to the Performance tab
  3. Select a time range (7, 14, or 30 days)
  4. View summary cards and trend charts
  5. Export data to CSV

API Reference

Endpoint | Method | Description
/api/prompts | POST | Create a new prompt
/api/prompts/list | POST | List all prompts (with search/filter)
/api/prompts/:id | GET | Get a single prompt
/api/prompts/:id | PUT | Update a prompt
/api/prompts/:id | DELETE | Delete a prompt
/api/prompts/:id/versions | GET | Get version history
/api/prompts/rollback | POST | Roll back to a version
/api/prompts/compare | POST | Compare two versions
/api/proxy-prompts/:proxyID/deploy | POST | Deploy prompt to proxy
/api/proxy-prompts/:proxyID/:promptID | DELETE | Undeploy from proxy
/api/proxy-prompts/:proxyID | GET | List proxy deployments
/api/playground/execute | POST | Execute in playground
/api/playground/sessions | POST | Save playground session

Tier Limits

Limit | Free | Starter | Pro | Enterprise
Prompt Templates | 3 | 25 | 100 | Unlimited
Playground/Day | 10 | 100 | 1,000 | Unlimited
Version History | Full | Full | Full | Full
Deployments | Unlimited | Unlimited | Unlimited | Unlimited

Best Practices

Organize with Tags

# Good tag examples
- customer-support
- onboarding
- technical-help
- sales-assistant
- internal-tools

Write Clear Variable Descriptions

{
  "name": "company_name",
  "description": "The name of the customer's company for personalization",
  "default_value": "your company",
  "required": false
}

Use Change Summaries

When updating prompts, always add a change summary:

  • "Added tone instruction for friendlier responses"
  • "Fixed formatting issue in code examples"
  • "Optimized for shorter responses to reduce cost"

Troubleshooting

Next Steps