
Commit e3443a9

Merge pull request #762 from MicrosoftDocs/main
Merging from main to live
2 parents: 30f03c6 + 433d458

7 files changed: 207 additions & 25 deletions


agent-framework/tutorials/agents/enable-observability.md

Lines changed: 1 addition & 1 deletion
@@ -162,7 +162,7 @@ pip install azure-monitor-opentelemetry

 ## Enable OpenTelemetry in your app

-Agent Frameworkagent framework provides a convenient `setup_observability` function that configures OpenTelemetry with sensible defaults.
+Agent Framework provides a convenient `setup_observability` function that configures OpenTelemetry with sensible defaults.
 By default, it exports to the console if no specific exporter is configured.

 ```python

agent-framework/user-guide/agents/TOC.yml

Lines changed: 2 additions & 0 deletions
@@ -8,6 +8,8 @@
   href: multi-turn-conversation.md
 - name: Agent Middleware
   href: agent-middleware.md
+- name: Agent Retrieval Augmented Generation (RAG)
+  href: agent-rag.md
 - name: Agent Memory
   href: agent-memory.md
 - name: Agent Observability

agent-framework/user-guide/agents/agent-memory.md

Lines changed: 18 additions & 13 deletions
@@ -15,21 +15,22 @@ Agent memory is a crucial capability that allows agents to maintain context acro

 ::: zone pivot="programming-language-csharp"

-## Memory Types
-
 The Agent Framework supports several types of memory to accommodate different use cases, including managing chat history as part of short term memory and providing extension points for extracting, storing and injecting long term memories into agents.

-### Chat History (short term memory)
+## Chat History (short term memory)

 Various chat history storage options are supported by the Agent Framework. The available options vary by agent type and the underlying service(s) used to build the agent.

-E.g. where an agent is built using a service that only supports storage of chat history in the service, the Agent Framework must respect what the service requires.
+Here are the two main scenarios supported:

-#### In-memory chat history storage
+1. **In-memory storage**: Agent is built on a service that does not support in-service storage of chat history (e.g. OpenAI Chat Completion). The Agent Framework will by default store the full chat history in-memory in the `AgentThread` object, but developers can provide a custom `ChatMessageStore` implementation to store chat history in a 3rd party store if required.
+1. **In-service storage**: Agent is built on a service that requires in-service storage of chat history (e.g. Azure AI Foundry Persistent Agents). The Agent Framework will store the id of the remote chat history in the `AgentThread` object and no other chat history storage options are supported.

-When using a service that does not support in-service storage of chat history, the Agent Framework will store chat history in-memory in the `AgentThread` object by default. In this case the full chat history stored in the thread object, plus any new messages, will be provided to the underlying service on each agent run.
+### In-memory chat history storage

-E.g, when using OpenAI Chat Completion as the underlying service for agents, the following code will result in the thread object containing the chat history from the agent run.
+When using a service that does not support in-service storage of chat history, the Agent Framework will default to storing chat history in-memory in the `AgentThread` object. In this case, the full chat history that is stored in the thread object, plus any new messages, will be provided to the underlying service on each agent run. This allows for a natural conversational experience with the agent, where the caller only provides the new user message, and the agent only returns new answers, but the agent has access to the full conversation history and will use it when generating its response.
+
+When using OpenAI Chat Completion as the underlying service for agents, the following code will result in the thread object containing the chat history from the agent run.

 ```csharp
 AIAgent agent = new OpenAIClient("<your_api_key>")
@@ -48,7 +49,7 @@ IList<ChatMessage>? messages = thread.GetService<IList<ChatMessage>>();
 > [!NOTE]
 > Retrieving messages from the `AgentThread` object in this way will only work if in-memory storage is being used.

-##### Chat History reduction with In-Memory storage
+#### Chat History reduction with In-Memory storage

 The built-in `InMemoryChatMessageStore` that is used by default when the underlying service does not support in-service storage,
 can be configured with a reducer to manage the size of the chat history.
@@ -80,9 +81,9 @@ AIAgent agent = new OpenAIClient("<your_api_key>")
 > [!NOTE]
 > This feature is only supported when using the `InMemoryChatMessageStore`. When a service has in-service chat history storage, it is up to the service itself to manage the size of the chat history. Similarly, when using 3rd party storage (see below), it is up to the 3rd party storage solution to manage the chat history size. If you provide a `ChatMessageStoreFactory` for a message store but you use a service with built-in chat history storage, the factory will not be used.

-#### Inference service chat history storage
+### Inference service chat history storage

-When using a service that requires in-service storage of chat history, the Agent Framework will storage the id of the remote chat history in the `AgentThread` object.
+When using a service that requires in-service storage of chat history, the Agent Framework will store the id of the remote chat history in the `AgentThread` object.

 E.g, when using OpenAI Responses with store=true as the underlying service for agents, the following code will result in the thread object containing the last response id returned by the service.

@@ -99,7 +100,7 @@ Console.WriteLine(await agent.RunAsync("Tell me a joke about a pirate.", thread)
 > Therefore, depending on the mode that the service is used in, the Agent Framework will either default to storing the full chat history in memory, or storing an id reference
 > to the service stored chat history.

-#### 3rd party chat history storage
+### 3rd party chat history storage

 When using a service that does not support in-service storage of chat history, the Agent Framework allows developers to replace the default in-memory storage of chat history with 3rd party chat history storage. The developer is required to provide a subclass of the base abstract `ChatMessageStore` class.

@@ -134,7 +135,7 @@ AIAgent agent = new AzureOpenAIClient(
 > [!TIP]
 > For a detailed example on how to create a custom message store, see the [Storing Chat History in 3rd Party Storage](../../tutorials/agents/third-party-chat-history-storage.md) tutorial.

-### Long term memory
+## Long term memory

 The Agent Framework allows developers to provide custom components that can extract memories or provide memories to an agent.

@@ -143,7 +144,7 @@ To implement such a memory component, the developer needs to subclass the `AICon
 > [!TIP]
 > For a detailed example on how to create a custom memory component, see the [Adding Memory to an Agent](../../tutorials/agents/memory.md) tutorial.

-### AgentThread Serialization
+## AgentThread Serialization

 It is important to be able to persist an `AgentThread` object between agent invocations. This allows for situations where a user may ask a question of the agent, and take a long time to ask follow up questions. This allows the `AgentThread` state to survive service or app restarts.

@@ -159,6 +160,10 @@ JsonElement serializedThreadState = thread.Serialize();
 AgentThread resumedThread = AIAgent.DeserializeThread(serializedThreadState);
 ```

+> [!NOTE]
+> `AgentThread` objects may contain more than just chat history, e.g. context providers may also store state in the thread object. Therefore, it is important to always serialize, store and deserialize the entire `AgentThread` object to ensure that all state is preserved.
+> [!IMPORTANT]
+> Always treat `AgentThread` objects as opaque objects, unless you are very sure of the internals. The contents may vary not just by agent type, but also by service type and configuration.
 > [!WARNING]
 > Deserializing a thread with a different agent than that which originally created it, or with an agent that has a different configuration than the original agent, may result in errors or unexpected behavior.

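The serialize/deserialize pair shown in the hunk above can be extended into a simple file-based persistence round trip. A minimal sketch, assuming an existing `thread` (`AgentThread`) as in the surrounding examples; the file name is a placeholder:

```csharp
// Persist the thread state so it survives an app or service restart.
// `thread` is an AgentThread produced by a previous agent run.
JsonElement serializedThreadState = thread.Serialize();
File.WriteAllText("thread-state.json", serializedThreadState.GetRawText());

// After a restart, reload the saved state and resume the conversation,
// using the same agent type and configuration as before.
using JsonDocument savedState = JsonDocument.Parse(File.ReadAllText("thread-state.json"));
AgentThread resumedThread = AIAgent.DeserializeThread(savedState.RootElement);
```

Because the thread is treated as an opaque JSON blob, the same round trip works regardless of whether the thread holds full chat history or only a remote conversation id.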
agent-framework/user-guide/agents/agent-middleware.md

Lines changed: 1 addition & 2 deletions
@@ -464,8 +464,7 @@ This middleware approach allows you to implement sophisticated response transfor

 ::: zone-end

-
 ## Next steps

 > [!div class="nextstepaction"]
-> [Agent Memory](./agent-memory.md)
+> [Agent Retrieval Augmented Generation (RAG)](./agent-rag.md)
agent-framework/user-guide/agents/agent-rag.md

Lines changed: 97 additions & 0 deletions
@@ -0,0 +1,97 @@
+---
+title: Agent Retrieval Augmented Generation (RAG)
+description: Learn how to use Retrieval Augmented Generation (RAG) with Agent Framework
+zone_pivot_groups: programming-languages
+author: westey-m
+ms.topic: reference
+ms.author: westey
+ms.date: 11/06/2025
+ms.service: agent-framework
+---
+
+# Agent Retrieval Augmented Generation (RAG)
+
+Microsoft Agent Framework makes it easy to add Retrieval Augmented Generation (RAG) capabilities to an agent by attaching AI context providers to it.
+
+::: zone pivot="programming-language-csharp"
+
+## Using TextSearchProvider
+
+The `TextSearchProvider` class is an out-of-the-box implementation of a RAG context provider.
+
+It can easily be attached to a `ChatClientAgent` using the `AIContextProviderFactory` option to provide RAG capabilities to the agent.
+
+```csharp
+// Create the AI agent with the TextSearchProvider as the AI context provider.
+AIAgent agent = azureOpenAIClient
+    .GetChatClient(deploymentName)
+    .CreateAIAgent(new ChatClientAgentOptions
+    {
+        Instructions = "You are a helpful support specialist for Contoso Outdoors. Answer questions using the provided context and cite the source document when available.",
+        AIContextProviderFactory = ctx => new TextSearchProvider(SearchAdapter, ctx.SerializedState, ctx.JsonSerializerOptions, textSearchOptions)
+    });
+```
+
+The `TextSearchProvider` requires a function that provides the search results given a query. This can be implemented using any search technology, e.g. Azure AI Search or a web search engine.
+
+Here is an example of a mock search function that returns pre-defined results based on the query.
+`SourceName` and `SourceLink` are optional, but if provided they will be used by the agent to cite the source of the information when answering the user's question.
+
+```csharp
+static Task<IEnumerable<TextSearchProvider.TextSearchResult>> SearchAdapter(string query, CancellationToken cancellationToken)
+{
+    // The mock search inspects the user's question and returns pre-defined snippets
+    // that resemble documents stored in an external knowledge source.
+    List<TextSearchProvider.TextSearchResult> results = new();
+
+    if (query.Contains("return", StringComparison.OrdinalIgnoreCase) || query.Contains("refund", StringComparison.OrdinalIgnoreCase))
+    {
+        results.Add(new()
+        {
+            SourceName = "Contoso Outdoors Return Policy",
+            SourceLink = "https://contoso.com/policies/returns",
+            Text = "Customers may return any item within 30 days of delivery. Items should be unused and include original packaging. Refunds are issued to the original payment method within 5 business days of inspection."
+        });
+    }
+
+    return Task.FromResult<IEnumerable<TextSearchProvider.TextSearchResult>>(results);
+}
+```
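Since the mock adapter is deterministic, it can be exercised directly to see the result shape before wiring it into an agent. A small usage sketch, assuming top-level statements:

```csharp
// Exercise the mock SearchAdapter above: a query mentioning "refund"
// matches the pre-defined snippet; unrelated queries return no results.
var hits = await SearchAdapter("How do refunds work?", CancellationToken.None);
foreach (var hit in hits)
{
    Console.WriteLine($"{hit.SourceName} ({hit.SourceLink})");
}
```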
+
+### TextSearchProvider Options
+
+The `TextSearchProvider` can be customized via the `TextSearchProviderOptions` class. Here is an example of creating options to run the search prior to every model invocation and keep a short rolling window of conversation context.
+
+```csharp
+TextSearchProviderOptions textSearchOptions = new()
+{
+    // Run the search prior to every model invocation and keep a short rolling window of conversation context.
+    SearchTime = TextSearchProviderOptions.TextSearchBehavior.BeforeAIInvoke,
+    RecentMessageMemoryLimit = 6,
+};
+```
+
+The `TextSearchProvider` class supports the following options via the `TextSearchProviderOptions` class.
+
+| Option | Type | Description | Default |
+|--------|------|-------------|---------|
+| SearchTime | `TextSearchProviderOptions.TextSearchBehavior` | Indicates when the search should be executed: either before each model invocation, or on-demand via function calling. | `TextSearchProviderOptions.TextSearchBehavior.BeforeAIInvoke` |
+| FunctionToolName | `string` | The name of the exposed search tool when operating in on-demand mode. | "Search" |
+| FunctionToolDescription | `string` | The description of the exposed search tool when operating in on-demand mode. | "Allows searching for additional information to help answer the user question." |
+| ContextPrompt | `string` | The context prompt prefixed to results when operating in `BeforeAIInvoke` mode. | "## Additional Context\nConsider the following information from source documents when responding to the user:" |
+| CitationsPrompt | `string` | The instruction appended after results to request citations when operating in `BeforeAIInvoke` mode. | "Include citations to the source document with document name and link if document name and link is available." |
+| ContextFormatter | `Func<IList<TextSearchProvider.TextSearchResult>, string>` | Optional delegate to fully customize formatting of the result list when operating in `BeforeAIInvoke` mode. If provided, `ContextPrompt` and `CitationsPrompt` are ignored. | `null` |
+| RecentMessageMemoryLimit | `int` | The number of recent conversation messages (both user and assistant) to keep in memory and include when constructing the search input for `BeforeAIInvoke` searches. | `0` (disabled) |
+| RecentMessageRolesIncluded | `List<ChatRole>` | The list of `ChatRole` types to filter recent messages to when deciding which recent messages to include when constructing the search input. | `ChatRole.User` |
+
+::: zone-end
+::: zone pivot="programming-language-python"
+
+More info coming soon.
+
+::: zone-end
+
+## Next steps
+
+> [!div class="nextstepaction"]
+> [Agent Memory](./agent-memory.md)
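The snippets in this new file can be combined into one end-to-end flow. A sketch under the same assumptions as the file's own examples (`azureOpenAIClient`, `deploymentName`, and the `SearchAdapter` function are assumed to exist as shown above):

```csharp
// Options: inject search results before every model call and keep a short
// rolling window of recent messages as additional search context.
TextSearchProviderOptions textSearchOptions = new()
{
    SearchTime = TextSearchProviderOptions.TextSearchBehavior.BeforeAIInvoke,
    RecentMessageMemoryLimit = 6,
};

// Agent with the TextSearchProvider attached as its AI context provider.
AIAgent agent = azureOpenAIClient
    .GetChatClient(deploymentName)
    .CreateAIAgent(new ChatClientAgentOptions
    {
        Instructions = "You are a helpful support specialist for Contoso Outdoors. Answer questions using the provided context and cite the source document when available.",
        AIContextProviderFactory = ctx => new TextSearchProvider(SearchAdapter, ctx.SerializedState, ctx.JsonSerializerOptions, textSearchOptions)
    });

// The provider injects the return-policy snippet before the model call,
// so the agent can ground its answer and cite the source document.
Console.WriteLine(await agent.RunAsync("What is your refund policy?"));
```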

agent-framework/user-guide/agents/agent-types/index.md

Lines changed: 76 additions & 0 deletions
@@ -76,6 +76,82 @@ See the documentation for each agent type, for more information:
 |---|---|
 |[A2A](./a2a-agent.md)|An agent that serves as a proxy to a remote agent via the A2A protocol.|

+## Azure and OpenAI SDK Options Reference
+
+When using Azure AI Foundry, Azure OpenAI, or OpenAI services, you have various SDK options to connect to these services. In some cases, it is possible to use multiple SDKs to connect to the same service, or to use the same SDK to connect to different services. Here is a list of the different options available, with the URL that you should use when connecting to each. Make sure to replace `<resource>` and `<project>` with your actual resource and project names.
+
+| AI Service | SDK | NuGet | URL |
+|------------------|-----|-------|-----|
+| [Azure AI Foundry Models](/azure/ai-foundry/concepts/foundry-models-overview) | Azure OpenAI SDK <sup>2</sup> | [Azure.AI.OpenAI](https://www.nuget.org/packages/Azure.AI.OpenAI) | https://ai-foundry-&lt;resource&gt;.services.ai.azure.com/ |
+| [Azure AI Foundry Models](/azure/ai-foundry/concepts/foundry-models-overview) | OpenAI SDK <sup>3</sup> | [OpenAI](https://www.nuget.org/packages/OpenAI) | https://ai-foundry-&lt;resource&gt;.services.ai.azure.com/openai/v1/ |
+| [Azure AI Foundry Models](/azure/ai-foundry/concepts/foundry-models-overview) | Azure AI Inference SDK <sup>2</sup> | [Azure.AI.Inference](https://www.nuget.org/packages/Azure.AI.Inference) | https://ai-foundry-&lt;resource&gt;.services.ai.azure.com/models |
+| [Azure AI Foundry Agents](/azure/ai-foundry/agents/overview) | Azure AI Persistent Agents SDK | [Azure.AI.Agents.Persistent](https://www.nuget.org/packages/Azure.AI.Agents.Persistent) | https://ai-foundry-&lt;resource&gt;.services.ai.azure.com/api/projects/ai-project-&lt;project&gt; |
+| [Azure OpenAI](/azure/ai-foundry/openai/overview) <sup>1</sup> | Azure OpenAI SDK <sup>2</sup> | [Azure.AI.OpenAI](https://www.nuget.org/packages/Azure.AI.OpenAI) | https://&lt;resource&gt;.openai.azure.com/ |
+| [Azure OpenAI](/azure/ai-foundry/openai/overview) <sup>1</sup> | OpenAI SDK | [OpenAI](https://www.nuget.org/packages/OpenAI) | https://&lt;resource&gt;.openai.azure.com/openai/v1/ |
+| OpenAI | OpenAI SDK | [OpenAI](https://www.nuget.org/packages/OpenAI) | No URL required |
+
+1. [Upgrading from Azure OpenAI to Azure AI Foundry](/azure/ai-foundry/how-to/upgrade-azure-openai)
+1. We recommend using the OpenAI SDK.
+1. While we recommend using the OpenAI SDK to access Azure AI Foundry models, Azure AI Foundry Models support models from many different vendors, not just OpenAI. All these models are supported via the OpenAI SDK.
+
+### Using the OpenAI SDK
+
+As shown in the table above, the OpenAI SDK can be used to connect to multiple services.
+Depending on the service you are connecting to, you may need to set a custom URL when creating the `OpenAIClient`.
+You can also use different authentication mechanisms depending on the service.
+
+If a custom URL is required (see table above), you can set it via the `OpenAIClientOptions`.
+
+```csharp
+var clientOptions = new OpenAIClientOptions() { Endpoint = new Uri(serviceUrl) };
+```
+
+It's possible to use an API key when creating the client.
+
+```csharp
+OpenAIClient client = new OpenAIClient(new ApiKeyCredential(apiKey), clientOptions);
+```
+
+When using an Azure service, it's also possible to use Azure credentials instead of an API key.
+
+```csharp
+OpenAIClient client = new OpenAIClient(new BearerTokenPolicy(new AzureCliCredential(), "https://ai.azure.com/.default"), clientOptions);
+```
+
+Once you have created the `OpenAIClient`, you can get a sub-client for the specific service you want to use and then create an `AIAgent` from that.
+
+```csharp
+AIAgent agent = client
+    .GetChatClient(model)
+    .CreateAIAgent(instructions: "You are good at telling jokes.", name: "Joker");
+```
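Putting the three steps above together, a minimal end-to-end sketch; `serviceUrl`, `apiKey`, and `model` are placeholders to be replaced with your own values from the URL table:

```csharp
// Connect the OpenAI SDK to an Azure-hosted endpoint and create an agent.
// See the URL table above for the correct endpoint for each service.
var clientOptions = new OpenAIClientOptions() { Endpoint = new Uri(serviceUrl) };
OpenAIClient client = new OpenAIClient(new ApiKeyCredential(apiKey), clientOptions);

// Get the chat sub-client and wrap it as an agent.
AIAgent agent = client
    .GetChatClient(model)
    .CreateAIAgent(instructions: "You are good at telling jokes.", name: "Joker");

Console.WriteLine(await agent.RunAsync("Tell me a joke about a pirate."));
```

When connecting directly to OpenAI (no custom URL required), the `OpenAIClientOptions` step can simply be omitted.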
+
+### Using the Azure OpenAI SDK
+
+This SDK can be used to connect to both Azure OpenAI and Azure AI Foundry Models services.
+Either way, you will need to supply the correct service URL when creating the `AzureOpenAIClient`.
+See the table above for the correct URL to use.
+
+```csharp
+AIAgent agent = new AzureOpenAIClient(
+        new Uri(serviceUrl),
+        new AzureCliCredential())
+    .GetChatClient(deploymentName)
+    .CreateAIAgent(instructions: "You are good at telling jokes.", name: "Joker");
+```
+
+### Using the Azure AI Persistent Agents SDK
+
+This SDK is only supported with the Azure AI Foundry Agents service. See the table above for the correct URL to use.
+
+```csharp
+var persistentAgentsClient = new PersistentAgentsClient(serviceUrl, new AzureCliCredential());
+AIAgent agent = await persistentAgentsClient.CreateAIAgentAsync(
+    model: deploymentName,
+    name: "Joker",
+    instructions: "You are good at telling jokes.");
+```
+
 ::: zone-end

 ::: zone pivot="programming-language-python"
