agent-framework/user-guide/agents/agent-memory.md
::: zone pivot="programming-language-csharp"

The Agent Framework supports several types of memory to accommodate different use cases, including managing chat history as part of short-term memory and providing extension points for extracting, storing, and injecting long-term memories into agents.

## Chat History (short term memory)

Various chat history storage options are supported by the Agent Framework. The available options vary by agent type and the underlying service(s) used to build the agent.

Here are the two main scenarios supported:

1. **In-memory storage**: The agent is built on a service that does not support in-service storage of chat history (e.g. OpenAI Chat Completion). By default, the Agent Framework stores the full chat history in-memory in the `AgentThread` object, but developers can provide a custom `ChatMessageStore` implementation to store chat history in a 3rd party store if required.
1. **In-service storage**: The agent is built on a service that requires in-service storage of chat history (e.g. Azure AI Foundry Persistent Agents). The Agent Framework stores the id of the remote chat history in the `AgentThread` object, and no other chat history storage options are supported.

### In-memory chat history storage

When using a service that does not support in-service storage of chat history, the Agent Framework will default to storing chat history in-memory in the `AgentThread` object. In this case, the full chat history stored in the thread object, plus any new messages, will be provided to the underlying service on each agent run. This allows for a natural conversational experience with the agent: the caller provides only the new user message and the agent returns only new answers, while the agent still has access to the full conversation history and uses it when generating its response.

When using OpenAI Chat Completion as the underlying service for agents, the following code will result in the thread object containing the chat history from the agent run.
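The code sample itself is elided from this view; a minimal sketch of what it demonstrates might look like the following (the model name is a placeholder, and the exact creation overloads should be treated as assumptions):

```csharp
// Create an agent on OpenAI Chat Completion, which has no in-service chat history storage.
AIAgent agent = new OpenAIClient("<your_api_key>")
    .GetChatClient("gpt-4o-mini") // model name is a placeholder
    .CreateAIAgent(instructions: "You are a helpful assistant.");

// The thread stores the full chat history in memory by default.
AgentThread thread = agent.GetNewThread();
Console.WriteLine(await agent.RunAsync("Tell me a joke about a pirate.", thread));

// Because in-memory storage is in use, the messages from the run can be
// read back from the thread object afterwards.
```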
> Retrieving messages from the `AgentThread` object in this way will only work if in-memory storage is being used.

#### Chat History reduction with In-Memory storage

The built-in `InMemoryChatMessageStore`, which is used by default when the underlying service does not support in-service storage, can be configured with a reducer to manage the size of the chat history.
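The configuration sample is elided in this view; a sketch of wiring a reducer through the `ChatMessageStoreFactory` option might look like this (the reducer type name and its constructor are assumptions for illustration only):

```csharp
// Configure the default in-memory store with a reducer that caps history size.
AIAgent agent = new OpenAIClient("<your_api_key>")
    .GetChatClient("gpt-4o-mini") // model name is a placeholder
    .CreateAIAgent(new ChatClientAgentOptions
    {
        Instructions = "You are a helpful assistant.",
        // MessageCountingChatReducer is a hypothetical reducer name used for illustration.
        ChatMessageStoreFactory = ctx =>
            new InMemoryChatMessageStore(new MessageCountingChatReducer(maxMessageCount: 10))
    });
```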
> [!NOTE]
> This feature is only supported when using the `InMemoryChatMessageStore`. When a service has in-service chat history storage, it is up to the service itself to manage the size of the chat history. Similarly, when using 3rd party storage (see below), it is up to the 3rd party storage solution to manage the chat history size. If you provide a `ChatMessageStoreFactory` for a message store but you use a service with built-in chat history storage, the factory will not be used.

### Inference service chat history storage

When using a service that requires in-service storage of chat history, the Agent Framework will store the id of the remote chat history in the `AgentThread` object.

For example, when using OpenAI Responses with `store=true` as the underlying service for agents, the following code will result in the thread object containing the last response id returned by the service.
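The sample is elided in this view; a sketch, assuming the agent is created from the OpenAI SDK's Responses client, might look like:

```csharp
// Create an agent on OpenAI Responses, which stores chat history in the service.
AIAgent agent = new OpenAIClient("<your_api_key>")
    .GetOpenAIResponseClient("gpt-4o-mini") // model name is a placeholder
    .CreateAIAgent(instructions: "You are a helpful assistant.");

AgentThread thread = agent.GetNewThread();
Console.WriteLine(await agent.RunAsync("Tell me a joke about a pirate.", thread));

// The thread now holds only the id of the last response returned by the service;
// the full chat history remains stored in the service itself.
```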
> Therefore, depending on the mode in which the service is used, the Agent Framework will either default to storing the full chat history in memory, or to storing an id reference to the service-stored chat history.

### 3rd party chat history storage

When using a service that does not support in-service storage of chat history, the Agent Framework allows developers to replace the default in-memory storage of chat history with 3rd party chat history storage. The developer is required to provide a subclass of the base abstract `ChatMessageStore` class.
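As a hedged illustration of such a subclass (the abstract member names shown are assumptions for illustration; see the tutorial linked below for the actual contract):

```csharp
// Sketch of a custom ChatMessageStore that persists chat history in 3rd party storage.
// The overridden member names below are assumptions, not the verified base-class contract.
public sealed class MyDatabaseChatMessageStore : ChatMessageStore
{
    private readonly List<ChatMessage> _messages = []; // stand-in for a real database

    public override Task AddMessagesAsync(
        IEnumerable<ChatMessage> messages, CancellationToken cancellationToken = default)
    {
        _messages.AddRange(messages); // a real implementation would write to the database
        return Task.CompletedTask;
    }

    public override Task<IEnumerable<ChatMessage>> GetMessagesAsync(
        CancellationToken cancellationToken = default)
        => Task.FromResult<IEnumerable<ChatMessage>>(_messages);
}
```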
> [!TIP]
> For a detailed example on how to create a custom message store, see the [Storing Chat History in 3rd Party Storage](../../tutorials/agents/third-party-chat-history-storage.md) tutorial.

## Long term memory

The Agent Framework allows developers to provide custom components that can extract memories or provide memories to an agent.

To implement such a memory component, the developer needs to subclass the `AIContextProvider` class.
> [!TIP]
> For a detailed example on how to create a custom memory component, see the [Adding Memory to an Agent](../../tutorials/agents/memory.md) tutorial.

## AgentThread Serialization

It is important to be able to persist an `AgentThread` object between agent invocations. This supports situations where a user asks the agent a question and only asks a follow-up question much later, and it allows the `AgentThread` state to survive service or app restarts.
> `AgentThread` objects may contain more than just chat history, e.g. context providers may also store state in the thread object. Therefore, it is important to always serialize, store and deserialize the entire `AgentThread` object to ensure that all state is preserved.

> [!IMPORTANT]
> Always treat `AgentThread` objects as opaque objects, unless you are very sure of the internals. The contents may vary not just by agent type, but also by service type and configuration.

> [!WARNING]
> Deserializing a thread with a different agent than that which originally created it, or with an agent that has a different configuration than the original agent, may result in errors or unexpected behavior.
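A round-trip sketch of persisting a thread might look like the following (the `Serialize` and `DeserializeThread` member names reflect the serialization API this section describes, but treat the exact signatures as assumptions):

```csharp
// Serialize the entire thread, including chat history and any context provider state.
JsonElement serialized = thread.Serialize();
string json = JsonSerializer.Serialize(serialized);

// ...persist json, survive an app or service restart, then load it back...

// Deserialize with the same agent (and configuration) that created the thread.
AgentThread resumed = agent.DeserializeThread(JsonSerializer.Deserialize<JsonElement>(json));
```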

---
description: Learn how to use Retrieval Augmented Generation (RAG) with Agent Framework
zone_pivot_groups: programming-languages
author: westey-m
ms.topic: reference
ms.author: westey
ms.date: 11/06/2025
ms.service: agent-framework
---

# Agent Retrieval Augmented Generation (RAG)

Microsoft Agent Framework supports easily adding Retrieval Augmented Generation (RAG) capabilities to agents by adding AI Context Providers to the agent.
::: zone pivot="programming-language-csharp"
## Using TextSearchProvider
The `TextSearchProvider` class is an out-of-the-box implementation of a RAG context provider.
It can easily be attached to a `ChatClientAgent` using the `AIContextProviderFactory` option to provide RAG capabilities to the agent.

```csharp
// Create the AI agent with the TextSearchProvider as the AI context provider.
AIAgent agent = azureOpenAIClient
    .GetChatClient(deploymentName)
    .CreateAIAgent(new ChatClientAgentOptions
    {
        Instructions = "You are a helpful support specialist for Contoso Outdoors. Answer questions using the provided context and cite the source document when available.",
        // The remainder of the original sample is elided in this view; attaching the
        // provider via the AIContextProviderFactory option (as described above) might look like:
        AIContextProviderFactory = ctx => textSearchProvider
    });
```

The `TextSearchProvider` requires a function that provides the search results given a query. This can be implemented using any search technology, e.g. Azure AI Search or a web search engine.

Here is an example of a mock search function that returns pre-defined results based on the query.
`SourceName` and `SourceLink` are optional, but if provided they will be used by the agent to cite the source of the information when answering the user's question.
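The original mock implementation is elided in this view apart from its result text; a minimal sketch (the delegate signature, and the `SourceName`/`SourceLink` values, are assumptions for illustration) might look like:

```csharp
// Mock search function returning a pre-defined result; a real implementation
// could call Azure AI Search, a vector store, or a web search engine.
Task<IList<TextSearchProvider.TextSearchResult>> SearchAsync(string query, CancellationToken cancellationToken)
{
    IList<TextSearchProvider.TextSearchResult> results =
    [
        new()
        {
            SourceName = "Contoso Outdoors Return Policy",            // hypothetical source name
            SourceLink = "https://contoso-outdoors.example/returns",  // hypothetical link
            Text = "Customers may return any item within 30 days of delivery. Items should be unused and include original packaging. Refunds are issued to the original payment method within 5 business days of inspection."
        }
    ];
    return Task.FromResult(results);
}
```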
The `TextSearchProvider` can be customized via the `TextSearchProviderOptions` class. Here is an example of creating options to run the search prior to every model invocation and keep a short rolling window of conversation context.

```csharp
TextSearchProviderOptions textSearchOptions = new()
{
    // Run the search prior to every model invocation and keep a short rolling window of conversation context.
    SearchTime = TextSearchProviderOptions.TextSearchBehavior.BeforeAIInvoke,
    RecentMessageMemoryLimit = 2 // the remainder of the original sample is elided; this value is illustrative
};
```
The `TextSearchProvider` class supports the following options via the `TextSearchProviderOptions` class.

| Option | Type | Description | Default |
|--------|------|-------------|---------|
| SearchTime | `TextSearchProviderOptions.TextSearchBehavior` | Indicates when the search should be executed: each time the agent is invoked, or on demand via function calling. | `TextSearchProviderOptions.TextSearchBehavior.BeforeAIInvoke` |
| FunctionToolName | `string` | The name of the exposed search tool when operating in on-demand mode. | "Search" |
| FunctionToolDescription | `string` | The description of the exposed search tool when operating in on-demand mode. | "Allows searching for additional information to help answer the user question." |
| ContextPrompt | `string` | The context prompt prefixed to results when operating in `BeforeAIInvoke` mode. | "## Additional Context\nConsider the following information from source documents when responding to the user:" |
| CitationsPrompt | `string` | The instruction appended after results to request citations when operating in `BeforeAIInvoke` mode. | "Include citations to the source document with document name and link if document name and link is available." |
| ContextFormatter | `Func<IList<TextSearchProvider.TextSearchResult>, string>` | Optional delegate to fully customize formatting of the result list when operating in `BeforeAIInvoke` mode. If provided, `ContextPrompt` and `CitationsPrompt` are ignored. | `null` |
| RecentMessageMemoryLimit | `int` | The number of recent conversation messages (both user and assistant) to keep in memory and include when constructing the search input for `BeforeAIInvoke` searches. | `0` (disabled) |
| RecentMessageRolesIncluded | `List<ChatRole>` | The list of `ChatRole` types used to filter which recent messages to include when constructing the search input. | `ChatRole.User` |

agent-framework/user-guide/agents/agent-types/index.md

See the documentation for each agent type for more information:

| Agent Type | Description |
|---|---|
| [A2A](./a2a-agent.md) | An agent that serves as a proxy to a remote agent via the A2A protocol. |

## Azure and OpenAI SDK Options Reference

When using Azure AI Foundry, Azure OpenAI, or OpenAI services, you have various SDK options for connecting. In some cases, it is possible to use multiple SDKs to connect to the same service, or to use the same SDK to connect to different services. Here is a list of the available options, with the URL that you should use when connecting to each. Make sure to replace `<resource>` and `<project>` with your actual resource and project names.

| AI Service | SDK | Nuget | Url |
|------------------|-----|-------|-----|
| [Azure AI Foundry Models](/azure/ai-foundry/concepts/foundry-models-overview) | Azure OpenAI SDK <sup>2</sup> | [Azure.AI.OpenAI](https://www.nuget.org/packages/Azure.AI.OpenAI) | `https://ai-foundry-<resource>.services.ai.azure.com/` |
| [Azure AI Foundry Models](/azure/ai-foundry/concepts/foundry-models-overview) | OpenAI SDK <sup>3</sup> | [OpenAI](https://www.nuget.org/packages/OpenAI) | `https://ai-foundry-<resource>.services.ai.azure.com/openai/v1/` |
| [Azure AI Foundry Models](/azure/ai-foundry/concepts/foundry-models-overview) | Azure AI Inference SDK <sup>2</sup> | [Azure.AI.Inference](https://www.nuget.org/packages/Azure.AI.Inference) | `https://ai-foundry-<resource>.services.ai.azure.com/models` |
| [Azure AI Foundry Agents](/azure/ai-foundry/agents/overview) | Azure AI Persistent Agents SDK | [Azure.AI.Agents.Persistent](https://www.nuget.org/packages/Azure.AI.Agents.Persistent) | `https://ai-foundry-<resource>.services.ai.azure.com/api/projects/ai-project-<project>` |
| OpenAI | OpenAI SDK | [OpenAI](https://www.nuget.org/packages/OpenAI) | No url required |

1. [Upgrading from Azure OpenAI to Azure AI Foundry](/azure/ai-foundry/how-to/upgrade-azure-openai)
1. We recommend using the OpenAI SDK.
1. While we recommend using the OpenAI SDK to access Azure AI Foundry models, Azure AI Foundry Models support models from many different vendors, not just OpenAI. All these models are supported via the OpenAI SDK.
### Using the OpenAI SDK

As shown in the table above, the OpenAI SDK can be used to connect to multiple services.
Depending on the service you are connecting to, you may need to set a custom URL when creating the `OpenAIClient`.
You can also use different authentication mechanisms depending on the service.

If a custom URL is required (see the table above), you can set it via the `OpenAIClientOptions`.
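For example, a sketch of pointing the OpenAI SDK at an Azure AI Foundry Models endpoint (the URL comes from the table above; the credential choice is illustrative):

```csharp
using System.ClientModel;
using OpenAI;

// Connect the OpenAI SDK to a non-default service by setting a custom endpoint.
OpenAIClient client = new(
    new ApiKeyCredential("<your_api_key>"),
    new OpenAIClientOptions
    {
        Endpoint = new Uri("https://ai-foundry-<resource>.services.ai.azure.com/openai/v1/")
    });
```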