AI & Cloud Infrastructure
November 28, 2025
Running Open-Source AI Models at Scale: Azure Container Apps, AKS, and On-Premise Deployments - Microsoft Ignite 2025
Microsoft Ignite BRK117: Deploy open-source AI models (Llama 3.3, Mistral) with Azure Container Apps serverless GPUs, AKS with KAITO workflows, and on-premise infrastructure. Covers 60-85% cost reduction, data sovereignty, and hybrid architectures with Azure Arc.
Microsoft Ignite 2025
Azure Container Apps
Azure Kubernetes Service
Open-Source AI
Llama 3.3
Mistral AI
Serverless GPU
KAITO
vLLM
On-Premise AI
Azure Arc
Hybrid Cloud
GPU Orchestration
Cost Optimization
Data Sovereignty
Model Deployment
Fine-Tuning
RAG Pipelines
Self-Hosted Models
By Technspire Team