Kubernetes ConfigMaps and Secrets for AI Agent Configuration
Learn how to manage AI agent configuration with Kubernetes ConfigMaps and Secrets — including environment injection, volume mounts, secret rotation, and best practices for API key management.
The Configuration Challenge for AI Agents
AI agents need extensive configuration: LLM API keys, model names, temperature settings, tool endpoint URLs, database credentials, rate limits, and prompt templates. Hardcoding any of these into your container image creates a rigid, insecure deployment. Kubernetes solves this with two resources — ConfigMaps for non-sensitive data and Secrets for credentials.
ConfigMaps: Non-Sensitive Configuration
A ConfigMap stores key-value pairs or entire files that Pods consume as environment variables or mounted volumes.
# ai-agent-config.yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: ai-agent-config
  namespace: ai-agents
data:
  # Key-value pairs
  MODEL_NAME: "gpt-4o"
  TEMPERATURE: "0.7"
  MAX_TOKENS: "4096"
  LOG_LEVEL: "INFO"
  TOOL_TIMEOUT_SECONDS: "30"
  # Multi-line prompt template
  system_prompt.txt: |
    You are a helpful AI assistant for customer support.
    Always be polite and professional.
    If you cannot answer a question, escalate to a human agent.
    Never disclose internal system details.
Apply it to your cluster:
kubectl apply -f ai-agent-config.yaml
Injecting ConfigMaps as Environment Variables
Reference ConfigMap values in your Deployment spec:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ai-agent
  namespace: ai-agents
spec:
  replicas: 2
  selector:
    matchLabels:
      app: ai-agent
  template:
    metadata:
      labels:
        app: ai-agent
    spec:
      containers:
        - name: agent
          image: myregistry/ai-agent:1.0.0
          envFrom:
            - configMapRef:
                name: ai-agent-config
          volumeMounts:
            - name: prompt-volume
              mountPath: /app/prompts
              readOnly: true
      volumes:
        - name: prompt-volume
          configMap:
            name: ai-agent-config
            items:
              - key: system_prompt.txt
                path: system_prompt.txt
The envFrom directive injects all key-value pairs as environment variables. The volume mount makes the prompt template available as a file at /app/prompts/system_prompt.txt.
Secrets: Sensitive Credentials
Secrets are structurally similar to ConfigMaps, but their values are stored base64-encoded and can be governed by tighter RBAC controls. Note that base64 is an encoding, not encryption — anyone with read access to the Secret can recover the plaintext. Use Secrets for API keys, database passwords, and tokens.
# ai-agent-secrets.yaml
apiVersion: v1
kind: Secret
metadata:
  name: ai-agent-secrets
  namespace: ai-agents
type: Opaque
stringData:
  OPENAI_API_KEY: "sk-proj-your-key-here"
  DATABASE_URL: "postgresql://agent:password@db-host:5432/agents"
  REDIS_URL: "redis://:secret@redis-host:6379/0"
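The stringData field accepts plaintext for convenience; on write, Kubernetes base64-encodes each value into the Secret's data field. A quick Python sketch of that relationship (the key shown is a placeholder, not a real credential):

```python
import base64

# What you write in stringData (placeholder value)
plaintext = "sk-proj-your-key-here"

# What Kubernetes actually stores in the Secret's data field
encoded = base64.b64encode(plaintext.encode()).decode()

# Anyone with read access can trivially reverse it -- base64 is not encryption
decoded = base64.b64decode(encoded).decode()
assert decoded == plaintext
```

This is why encryption at rest and RBAC matter more than the encoding itself.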
Reference Secrets the same way as ConfigMaps:
containers:
  - name: agent
    image: myregistry/ai-agent:1.0.0
    envFrom:
      - configMapRef:
          name: ai-agent-config
      - secretRef:
          name: ai-agent-secrets
Reading Configuration in Python
Your agent code reads configuration through standard environment variables and file reads:
import os
from pathlib import Path

class AgentConfig:
    # Non-sensitive settings from the ConfigMap, with safe defaults
    model_name: str = os.environ.get("MODEL_NAME", "gpt-4o")
    temperature: float = float(os.environ.get("TEMPERATURE", "0.7"))
    max_tokens: int = int(os.environ.get("MAX_TOKENS", "4096"))

    # Credentials from the Secret -- no defaults, fail loudly if missing
    openai_api_key: str = os.environ["OPENAI_API_KEY"]
    database_url: str = os.environ["DATABASE_URL"]

    @staticmethod
    def load_system_prompt() -> str:
        prompt_path = Path("/app/prompts/system_prompt.txt")
        return prompt_path.read_text()
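The bare os.environ lookups above raise a terse KeyError if a Secret was never mounted. One possible refinement — a hypothetical require_env helper, not part of any library — fails fast at startup with a message that names the missing variable:

```python
import os

def require_env(name: str) -> str:
    """Return a required environment variable, or fail fast with a clear error."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(
            f"Missing required environment variable {name!r} -- "
            "check that the Secret is referenced in the Deployment's envFrom."
        )
    return value

# Usage at startup, before the agent serves any traffic:
# openai_api_key = require_env("OPENAI_API_KEY")
```

Failing during startup keeps the Pod in a crash loop where kubectl describe makes the misconfiguration obvious, instead of surfacing it mid-request.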
Secret Rotation Without Downtime
When you need to rotate an API key, update the Secret and trigger a rolling restart:
# Update the secret
kubectl create secret generic ai-agent-secrets \
  --from-literal=OPENAI_API_KEY="sk-proj-new-key" \
  --from-literal=DATABASE_URL="postgresql://agent:newpass@db-host:5432/agents" \
  --from-literal=REDIS_URL="redis://:newsecret@redis-host:6379/0" \
  --namespace=ai-agents \
  --dry-run=client -o yaml | kubectl apply -f -
# Restart Pods to pick up new values
kubectl rollout restart deployment/ai-agent -n ai-agents
For zero-downtime rotation, mount Secrets as volumes instead of environment variables. The kubelet refreshes volume-mounted Secret files automatically on its sync interval (typically within a minute or so), with no Pod restart required — but your application must re-read the file to pick up the new value, and Secrets mounted via subPath are never updated.
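To actually benefit from that, the agent has to notice when the mounted file changes. A minimal polling sketch — the path, class name, and interval are illustrative assumptions, and production code might prefer inotify:

```python
import time
from pathlib import Path
from typing import Callable, Optional

class SecretFileReloader:
    """Polls a volume-mounted Secret file and invokes a callback when it changes."""

    def __init__(self, path: str, on_change: Callable[[str], None]):
        self.path = Path(path)
        self.on_change = on_change
        self._last: Optional[str] = (
            self.path.read_text() if self.path.exists() else None
        )

    def check(self) -> bool:
        """Re-read the file; fire the callback and return True if it changed."""
        if not self.path.exists():
            return False
        current = self.path.read_text()
        if current != self._last:
            self._last = current
            self.on_change(current)
            return True
        return False

    def run(self, interval_seconds: float = 30.0) -> None:
        """Simple polling loop, suitable for a daemon thread."""
        while True:
            self.check()
            time.sleep(interval_seconds)
```

In practice you would start run() in a background daemon thread at startup and have on_change swap the rotated API key into your LLM client.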
FAQ
Should I use environment variables or volume mounts for AI agent configuration?
Use environment variables for simple key-value settings like model names, temperatures, and API keys. Use volume mounts for larger content like prompt templates, tool schemas, or configuration files. Volume-mounted Secrets have the advantage of automatic updates without Pod restarts, which is valuable for key rotation.
Are Kubernetes Secrets truly secure?
By default, Secrets are stored unencrypted in etcd. Enable encryption at rest in your cluster configuration to protect them. For production AI agent deployments, consider using a secrets management tool like HashiCorp Vault or AWS Secrets Manager with the External Secrets Operator, which syncs external secrets into Kubernetes Secret resources automatically.
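As a sketch of that pattern, an External Secrets Operator ExternalSecret resource can sync a value from AWS Secrets Manager into the ai-agent-secrets Secret — the store name and remote key path below are placeholders:

```yaml
apiVersion: external-secrets.io/v1beta1
kind: ExternalSecret
metadata:
  name: ai-agent-secrets
  namespace: ai-agents
spec:
  refreshInterval: 1h                # re-sync from the external store hourly
  secretStoreRef:
    name: aws-secrets-manager        # placeholder (Cluster)SecretStore name
    kind: ClusterSecretStore
  target:
    name: ai-agent-secrets           # the Kubernetes Secret the operator manages
  data:
    - secretKey: OPENAI_API_KEY
      remoteRef:
        key: prod/ai-agent/openai-api-key   # placeholder path in Secrets Manager
```

With this in place, rotating the key in the external store propagates to the cluster on the next refresh, and the key never lives in your Git repository.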
How do I manage different configurations across development, staging, and production?
Use Kustomize overlays or Helm values files. Create a base ConfigMap with shared settings and environment-specific overlays that override values like model names, rate limits, and log levels. This lets you run a cheaper model in development while using the full model in production without changing any application code.
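As an illustrative sketch of the Kustomize approach, a development overlay can patch the base ConfigMap to use a cheaper model — the file layout and values here are assumptions:

```yaml
# overlays/dev/kustomization.yaml
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - ../../base
patches:
  - patch: |-
      apiVersion: v1
      kind: ConfigMap
      metadata:
        name: ai-agent-config
        namespace: ai-agents
      data:
        MODEL_NAME: "gpt-4o-mini"   # cheaper model for development
        LOG_LEVEL: "DEBUG"
```

Running kubectl apply -k overlays/dev then produces the merged ConfigMap, while the production overlay keeps the base values.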
#Kubernetes #ConfigurationManagement #Secrets #AIDeployment #Security #AgenticAI #LearnAI #AIEngineering
CallSphere Team