
Announcing Azure AI Foundry Support: Five AI Providers, One Azure Credential

Access OpenAI GPT-4o, Meta Llama, Mistral AI, DeepSeek, and Microsoft Phi models through a single Azure credential. Bastio now supports Azure AI Foundry with full enterprise security.

Daniel S. Jacobsen, Founder & CEO · November 24, 2025

Announcing Azure AI Foundry Support

For enterprises already running on Microsoft Azure, managing AI across multiple vendors has meant juggling separate accounts, different APIs, and fragmented billing. Today, we're excited to announce full support for Azure AI Foundry in Bastio—bringing five AI providers together under your existing Azure infrastructure.

One Credential, Five Vendors

Azure AI Foundry's Model Catalog brings together leading AI models from multiple providers, all accessible through a single Azure API key:

  • OpenAI: GPT-4o, GPT-4o Mini, GPT-4 Turbo, o1-preview, o1-mini, and more
  • Meta Llama: Llama 3.1 405B, 70B, 8B with instruction tuning
  • Mistral AI: Mistral Large, Small, Nemo, and Codestral for code generation
  • DeepSeek: DeepSeek R1 and V3 reasoning models
  • Microsoft: Phi-4 and Phi-3.5 efficient models

One Azure credential. One invoice. Five world-class AI providers—all with an OpenAI-compatible API.

Why Azure AI Foundry?

For teams invested in Microsoft Azure, AI Foundry offers compelling advantages over managing separate provider relationships:

  • Unified Billing: Consolidate all AI spend in your existing Azure invoice
  • Enterprise Compliance: Leverage Azure's certifications (SOC 2, HIPAA, ISO 27001, FedRAMP High)
  • Network Security: VNet integration, private endpoints, and customer-managed keys
  • Global Scale: Deploy across 60+ Azure regions for low latency and data residency
  • Simple Authentication: Just an API key—no complex service account JSON or IAM policies

With Bastio's integration, you get all these benefits plus our comprehensive security layer.
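To make the "just an API key" point concrete, here is a minimal sketch of what authentication looks like when a single key is the only secret involved. The environment variable names are placeholders of our own choosing, and the endpoint value comes from your Azure AI Foundry project settings; this is an illustration, not a prescribed configuration.

import os
from openai import OpenAI

# Both values come from your Azure AI Foundry project settings.
# The environment variable names below are placeholders, not a
# convention defined by Azure or Bastio.
client = OpenAI(
    base_url=os.environ["AZURE_AI_FOUNDRY_ENDPOINT"],
    api_key=os.environ["AZURE_AI_FOUNDRY_API_KEY"],  # a plain API key, no service account JSON
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Ping"}],
)
print(response.choices[0].message.content)

No service account files, IAM role bindings, or token exchange steps are involved; rotating the key in your project settings and in your secret store is the whole credential lifecycle.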

Enterprise Security Meets Azure

When you route your Azure AI Foundry traffic through Bastio, you get the best of both worlds:

Real-Time Threat Detection

Detect and block prompt injection, jailbreaks, and malicious usage before they reach your models, regardless of which provider you're using. Our security engine analyzes every request in real time.

PII Protection

Automatically detect and redact sensitive data across all five providers, ensuring consistent compliance with GDPR, HIPAA, and CCPA. The same rules apply whether you're using GPT-4o or Llama.

Unified Observability

One dashboard to monitor usage across OpenAI, Meta, Mistral, DeepSeek, and Microsoft models. Track costs, analyze threats, and maintain audit logs in a single place—no more switching between vendor consoles.

Getting Started

Setting up Azure AI Foundry with Bastio takes just a few minutes:

  1. Create an AI Hub: In Azure Portal, create an Azure AI Hub and project in your preferred region
  2. Deploy Models: Use the Model Catalog to deploy your chosen models (partner models require marketplace acceptance)
  3. Get Your API Key: Copy your API key from the project settings
  4. Create a Bastio Proxy: Add your Azure credentials and start making requests, as in the Python example below

from openai import OpenAI

client = OpenAI(
    base_url="https://api.bastio.ai/v1/guard/{PROXY_ID}/v1",
    api_key="your-bastio-api-key"
)

# Switch between providers by changing the model name
response = client.chat.completions.create(
    model="gpt-4o",  # or Meta-Llama-3.1-70B-Instruct, Mistral-Large-2407, DeepSeek-R1
    messages=[{"role": "user", "content": "Hello from Azure AI Foundry!"}]
)

OpenAI-Compatible API

One of Azure AI Foundry's biggest advantages is its unified API. Unlike other multi-provider platforms that require different SDKs or API formats for each vendor, Azure provides an OpenAI-compatible interface for all models:

  • Same request format for GPT-4o and Llama
  • Same streaming protocol for Mistral and DeepSeek
  • Same error handling across all providers

This means you can switch between providers with a single line change—just update the model name.
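As a minimal sketch of what that looks like in practice, the snippet below streams the same prompt through two different providers using the same proxy client as in the Getting Started example. The model names are illustrative; use the deployment names from your own Model Catalog.

from openai import OpenAI

client = OpenAI(
    base_url="https://api.bastio.ai/v1/guard/{PROXY_ID}/v1",
    api_key="your-bastio-api-key",
)

# Same request shape and same streaming protocol for every provider;
# only the model name changes. The model IDs here are illustrative.
for model in ["gpt-4o", "Meta-Llama-3.1-70B-Instruct"]:
    print(f"\n### {model}")
    stream = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Summarize Azure AI Foundry in one sentence."}],
        stream=True,
    )
    for chunk in stream:
        # Some chunks (e.g. usage-only chunks) carry no content delta.
        if chunk.choices and chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)

The loop body never branches on the provider: request construction, streaming, and error handling stay identical whether the response comes from OpenAI, Meta, Mistral, DeepSeek, or Microsoft.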

Available Now

Azure AI Foundry support is available today for all Bastio customers. To get started, visit your Dashboard and create a new proxy with the "Azure AI Foundry" provider.

For detailed setup instructions including partner model deployment, check out our Azure AI Foundry Documentation.


Ready to Secure Your AI Applications?

Get started with Bastio today and protect your LLM applications from emerging threats.