
Semantic Kernel: Microsoft's Enterprise Agent Framework for .NET and Python

Learn how Semantic Kernel brings enterprise-grade agent capabilities to .NET and Python applications with planners, plugins, memory integration, and deep Azure ecosystem support.

Enterprise AI Needs a Different Framework

Most agent frameworks target Python-first startups building experimental AI products. Semantic Kernel targets a different audience: enterprise engineering teams building AI features into existing .NET and Python applications. Developed by Microsoft, it integrates deeply with the Azure ecosystem while remaining open-source and provider-agnostic.

The framework is designed for environments where you need to add AI capabilities to existing business applications — not build standalone AI agents from scratch.

Core Architecture

Semantic Kernel is organized around a kernel object that acts as a dependency injection container for AI services, plugins, and memory. You configure the kernel with the services you need, register plugins that provide capabilities, and then use the kernel to orchestrate AI interactions.

from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

# Create a kernel and register an AI service
# (OpenAIChatCompletion picks up OPENAI_API_KEY from the environment
# unless an api_key is passed explicitly)
kernel = Kernel()
kernel.add_service(
    OpenAIChatCompletion(
        service_id="chat",
        ai_model_id="gpt-4o",
    )
)

In .NET, the same concept uses familiar dependency injection patterns:

// C# version using builder pattern
var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion(
    deploymentName: "gpt-4o",
    endpoint: azureEndpoint,
    apiKey: azureApiKey
);
var kernel = builder.Build();

Plugins: The Building Blocks

In Semantic Kernel, capabilities are organized as plugins — collections of related functions that the AI can call. Each plugin groups related tools under a namespace:

from semantic_kernel.functions import kernel_function

class WeatherPlugin:
    @kernel_function(
        name="get_forecast",
        description="Get the weather forecast for a city"
    )
    def get_forecast(self, city: str) -> str:
        return f"The forecast for {city}: 72°F, partly cloudy"

    @kernel_function(
        name="get_alerts",
        description="Get active weather alerts for a region"
    )
    def get_alerts(self, region: str) -> str:
        return f"No active alerts for {region}"

# Register the plugin
kernel.add_plugin(WeatherPlugin(), plugin_name="Weather")

Plugins can also be defined inline using prompt templates — what Semantic Kernel calls semantic functions:

from semantic_kernel.prompt_template import InputVariable, PromptTemplateConfig

summarize_config = PromptTemplateConfig(
    template="Summarize the following text in {{$style}} style: {{$input}}",
    input_variables=[
        InputVariable(name="input", description="Text to summarize"),
        InputVariable(name="style", description="Writing style", default="concise"),
    ],
)

kernel.add_function(
    plugin_name="Text",
    function_name="summarize",
    prompt_template_config=summarize_config,
)
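To see how the {{$variable}} placeholders resolve, here is a library-free sketch of the substitution behavior: supplied variables are filled in first, and any variable left unset falls back to its declared default (this is a conceptual illustration, not Semantic Kernel's actual template renderer):

```python
import re

def render(template, variables, defaults=None):
    """Resolve {{$name}} placeholders from supplied variables, then defaults."""
    defaults = defaults or {}

    def sub(match):
        name = match.group(1)
        return str(variables.get(name, defaults.get(name, "")))

    return re.sub(r"\{\{\$(\w+)\}\}", sub, template)

rendered = render(
    "Summarize the following text in {{$style}} style: {{$input}}",
    {"input": "Quarterly revenue grew 12%."},
    defaults={"style": "concise"},
)
# "style" was not supplied, so it falls back to its default, "concise"
```

The same fallback applies in the real framework: callers of Text.summarize can omit style and still get a concise summary.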

Planners: Automatic Orchestration

Planners are Semantic Kernel's mechanism for automatically chaining plugin functions to accomplish a goal. Instead of manually defining the sequence of tool calls, you describe the goal and the planner figures out which plugins to invoke and in what order:


from semantic_kernel.planners import FunctionCallingStepwisePlanner

planner = FunctionCallingStepwisePlanner(
    service_id="chat",
    max_iterations=10,
)

# invoke is async, so this must run inside an event loop (e.g. via asyncio.run)
result = await planner.invoke(
    kernel,
    question="What is the weather forecast for Seattle, and summarize it in a tweet-length message?"
)
print(result.final_answer)

The planner sees all registered plugins, determines it needs to call Weather.get_forecast first, then Text.summarize with a tweet-length style, and chains them together. This is effectively automatic agent behavior without writing explicit orchestration logic.

Memory Integration

Semantic Kernel has first-class support for memory — both short-term conversation history and long-term vector-based memory:

from semantic_kernel.memory import SemanticTextMemory
from semantic_kernel.connectors.memory.azure_cognitive_search import (
    AzureCognitiveSearchMemoryStore,
)

memory_store = AzureCognitiveSearchMemoryStore(
    endpoint=azure_search_endpoint,
    admin_key=azure_search_key,
)
# embedding_service is a text-embedding service created earlier
# (e.g. an OpenAI or Azure OpenAI embedding connector)
memory = SemanticTextMemory(storage=memory_store, embeddings_generator=embedding_service)

# Save information to memory
await memory.save_information(
    collection="company_knowledge",
    id="policy_1",
    text="Remote employees must be available during core hours 10am-3pm EST",
)

# Recall relevant information
results = await memory.search(
    collection="company_knowledge",
    query="What are the remote work hours?",
    limit=3,
)

Memory integrates with Azure Cognitive Search, Qdrant, Pinecone, and other vector stores. This makes it straightforward to build agents that reference organizational knowledge.
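Under the hood, every vector store in this list does the same thing: embed the query, then rank stored entries by vector similarity. A toy sketch with hand-made three-dimensional "embeddings" (real embedding services produce vectors with hundreds or thousands of dimensions) makes the mechanism concrete:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy embeddings standing in for what the embedding service would produce.
store = {
    "policy_1": ([0.9, 0.1, 0.0], "Remote employees must be available 10am-3pm EST"),
    "policy_2": ([0.0, 0.2, 0.9], "Expense reports are due by the 5th"),
}

query_vec = [0.8, 0.2, 0.1]  # pretend embedding of "What are the remote work hours?"
best_id = max(store, key=lambda k: cosine(query_vec, store[k][0]))
# → "policy_1": the remote-work entry sits closest to the query vector
```

memory.search performs this ranking inside the configured store and returns the top `limit` matches with their relevance scores.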

Enterprise Integration Strengths

Semantic Kernel's real advantage is enterprise integration. It supports Azure Active Directory for authentication, Azure Key Vault for secrets, Application Insights for telemetry, and Azure AI Search for retrieval. If your organization runs on Azure, Semantic Kernel fits naturally into the existing infrastructure.

The .NET-first design also matters. Many enterprise codebases are written in C# — Semantic Kernel lets these teams add AI capabilities without rewriting them in Python.

FAQ

Can Semantic Kernel work with non-Azure LLM providers?

Yes. Semantic Kernel supports OpenAI directly, Hugging Face models, and has a growing list of community connectors. The Azure integration is a strength, not a requirement.

How does Semantic Kernel compare to LangChain?

LangChain is Python-first and broader in scope. Semantic Kernel is cross-platform (.NET and Python), more opinionated about plugin architecture, and designed for integration into existing enterprise applications rather than building standalone AI tools.

Is the planner reliable enough for production use?

The FunctionCallingStepwisePlanner is production-ready for well-scoped tasks where the available plugins clearly map to the goal. For complex, ambiguous goals, you may want to define explicit orchestration rather than relying on automatic planning.


#SemanticKernel #Microsoft #EnterpriseAI #NET #Python #AgenticAI #LearnAI #AIEngineering
