Commit a572982

Merge pull request #800 from eavanvalkenburg/update_obser
2 parents: 2985e6f + 12267dc

13 files changed: 326 additions, 299 deletions

.openpublishing.redirection.json (5 additions, 0 deletions)

```diff
@@ -829,6 +829,11 @@
       "source_path": "agent-framework/tutorials/workflows/visualization.md",
       "redirect_url": "/agent-framework/user-guide/workflows/visualization",
       "redirect_document_id": true
+    },
+    {
+      "source_path": "agent-framework/user-guide/agents/agent-observability.md",
+      "redirect_url": "/agent-framework/user-guide/observability",
+      "redirect_document_id": true
     }
   ]
 }
```
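Each redirect entry needs the same three keys as its neighbors. As a quick sanity check, a small hypothetical helper (illustrative only, not part of any build tooling) can flag malformed entries; the sample data below mirrors the entry added in this commit:

```python
import json

# Minimal redirect map mirroring the entry added in this commit
redirects = json.loads("""
[
  {
    "source_path": "agent-framework/user-guide/agents/agent-observability.md",
    "redirect_url": "/agent-framework/user-guide/observability",
    "redirect_document_id": true
  }
]
""")

REQUIRED_KEYS = {"source_path", "redirect_url", "redirect_document_id"}

def validate_redirects(entries):
    """Return the entries that are missing any required key (hypothetical check)."""
    return [e for e in entries if not REQUIRED_KEYS <= e.keys()]

print(validate_redirects(redirects))  # [] means every entry is well formed
```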

agent-framework/TOC.yml (2 additions, 0 deletions)

```diff
@@ -20,6 +20,8 @@ items:
   href: user-guide/hosting/TOC.yml
 - name: DevUI
   href: user-guide/devui/TOC.yml
+- name: Observability
+  href: user-guide/observability.md
 - name: Integrations
   items:
   - name: AG-UI
```
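For orientation, with the two added lines applied the affected stretch of the top-level TOC reads as follows (reconstructed from the hunk; surrounding entries unchanged):

```yaml
- name: DevUI
  href: user-guide/devui/TOC.yml
- name: Observability
  href: user-guide/observability.md
- name: Integrations
```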

agent-framework/tutorials/agents/enable-observability.md (62 additions, 163 deletions)

````diff
@@ -131,142 +131,60 @@ Because he wanted to improve his "arrr-ticulation"! ⛵
 ::: zone-end
 ::: zone pivot="programming-language-python"
 
-This tutorial shows how to enable OpenTelemetry on an agent so that interactions with the agent are automatically logged and exported.
-In this tutorial, output is written to the console using the OpenTelemetry console exporter.
+This tutorial shows how to quickly enable OpenTelemetry on an agent so that interactions with the agent are automatically logged and exported.
+
+For comprehensive documentation on observability, including all configuration options, environment variables, and advanced scenarios, see the [Observability user guide](../../user-guide/observability.md).
 
 ## Prerequisites
 
 For prerequisites, see the [Create and run a simple agent](./run-agent.md) step in this tutorial.
 
 ## Install packages
 
-To use Agent Framework with Azure OpenAI, you need to install the following packages. Agent Framework automatically includes all necessary OpenTelemetry dependencies:
+To use Agent Framework with OpenTelemetry, install the framework:
 
 ```bash
 pip install agent-framework --pre
 ```
 
-The following OpenTelemetry packages are included by default:
-```text
-opentelemetry-api
-opentelemetry-sdk
-opentelemetry-exporter-otlp-proto-grpc
-opentelemetry-semantic-conventions-ai
-```
-
-If you want to export to Azure Monitor (Application Insights), you also need to install the `azure-monitor-opentelemetry` package:
-
-```bash
-pip install azure-monitor-opentelemetry
-```
+For console output during development, no additional packages are needed. For other exporters, see the [Dependencies section](../../user-guide/observability.md#dependencies) in the user guide.
 
 ## Enable OpenTelemetry in your app
 
-Agent Framework provides a convenient `setup_observability` function that configures OpenTelemetry with sensible defaults.
-By default, it exports to the console if no specific exporter is configured.
+The simplest way to enable observability is to call `configure_otel_providers()`:
 
 ```python
-import asyncio
-from agent_framework.observability import setup_observability
+from agent_framework.observability import configure_otel_providers
 
-# Enable Agent Framework telemetry with console output (default behavior)
-setup_observability(enable_sensitive_data=True)
+# Enable console output for local development
+configure_otel_providers(enable_console_exporters=True)
 ```
 
-### Understanding `setup_observability` parameters
-
-The `setup_observability` function accepts the following parameters to customize your observability configuration:
-
-- **`enable_otel`** (bool, optional): Enables OpenTelemetry tracing and metrics. Default is `False` when using environment variables only, but is assumed `True` when calling `setup_observability()` programmatically. When using environment variables, set `ENABLE_OTEL=true`.
-
-- **`enable_sensitive_data`** (bool, optional): Controls whether sensitive data like prompts, responses, function call arguments, and results are included in traces. Default is `False`. Set to `True` to see actual prompts and responses in your traces. **Warning**: Be careful with this setting as it might expose sensitive data in your logs. Can also be set via the `ENABLE_SENSITIVE_DATA=true` environment variable.
-
-- **`otlp_endpoint`** (str, optional): The OTLP endpoint URL for exporting telemetry data. Default is `None`. Commonly set to `http://localhost:4317`. This creates an OTLPExporter for spans, metrics, and logs. Can be used with any OTLP-compliant endpoint such as the [OpenTelemetry Collector](https://opentelemetry.io/docs/collector/), the [Aspire Dashboard](/dotnet/aspire/fundamentals/dashboard/overview?tabs=bash), or other OTLP endpoints. Can also be set via the `OTLP_ENDPOINT` environment variable.
-
-- **`applicationinsights_connection_string`** (str, optional): Azure Application Insights connection string for exporting to Azure Monitor. Default is `None`. Creates AzureMonitorTraceExporter, AzureMonitorMetricExporter, and AzureMonitorLogExporter. You can find this connection string in the Azure portal under the "Overview" section of your Application Insights resource. Can also be set via the `APPLICATIONINSIGHTS_CONNECTION_STRING` environment variable. Requires installation of the `azure-monitor-opentelemetry` package.
+Or use environment variables for more flexibility:
 
-- **`vs_code_extension_port`** (int, optional): Port number for the AI Toolkit or Azure AI Foundry VS Code extension. Default is `4317`. Allows integration with VS Code extensions for local development and debugging. Can also be set via the `VS_CODE_EXTENSION_PORT` environment variable.
-
-- **`exporters`** (list, optional): Custom list of OpenTelemetry exporters for advanced scenarios. Default is `None`. Allows you to provide your own configured exporters when the standard options don't meet your needs.
-
-> [!IMPORTANT]
-> When no exporters are provided (through parameters, environment variables, or explicit exporters), the console exporter is configured by default for local debugging.
-
-### Setup options
-
-You can configure observability in three ways:
-
-**1. Environment variables** (simplest approach):
 ```bash
-export ENABLE_OTEL=true
-export ENABLE_SENSITIVE_DATA=true
-export OTLP_ENDPOINT=http://localhost:4317
+export ENABLE_INSTRUMENTATION=true
+export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
 ```
 
````
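The environment-variable route can be reasoned about with a small sketch. This is a hypothetical simplification of how `configure_otel_providers()` might select exporters from `OTEL_EXPORTER_OTLP_ENDPOINT` and the console opt-in; the actual Agent Framework logic may differ:

```python
import os

def resolve_exporters(enable_console_exporters: bool = False) -> list[str]:
    """Hypothetical sketch; the real configure_otel_providers() logic may differ."""
    exporters: list[str] = []
    # Standard OpenTelemetry variable: if an OTLP endpoint is set, export via OTLP.
    if os.environ.get("OTEL_EXPORTER_OTLP_ENDPOINT"):
        exporters.append("otlp")
    # Console exporters are opt-in, e.g. for local development.
    if enable_console_exporters:
        exporters.append("console")
    return exporters

os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4317"
print(resolve_exporters())  # ['otlp']
```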
````diff
-Then in your code:
 ```python
-from agent_framework.observability import setup_observability
-
-setup_observability()  # Reads from environment variables
-```
-
-**2. Programmatic configuration**:
-```python
-from agent_framework.observability import setup_observability
-
-# note that ENABLE_OTEL is implied to be True when calling setup_observability programmatically
-setup_observability(
-    enable_sensitive_data=True,
-    otlp_endpoint="http://localhost:4317",
-    applicationinsights_connection_string="InstrumentationKey=your_key"
-)
-```
-
-**3. Custom exporters** (for advanced scenarios):
-```python
-from agent_framework.observability import setup_observability
-from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
-from opentelemetry.sdk.trace.export import ConsoleSpanExporter
-
-custom_exporters = [
-    OTLPSpanExporter(endpoint="http://localhost:4317"),
-    ConsoleSpanExporter()
-]
+from agent_framework.observability import configure_otel_providers
 
-setup_observability(exporters=custom_exporters, enable_sensitive_data=True)
-```
-
-The `setup_observability` function sets the global tracer provider and meter provider, allowing you to create custom spans and metrics:
-
-```python
-from agent_framework.observability import get_tracer, get_meter
-
-tracer = get_tracer()
-meter = get_meter()
-
-with tracer.start_as_current_span("my_custom_span"):
-    # Your code here
-    pass
-
-counter = meter.create_counter("my_custom_counter")
-counter.add(1, {"key": "value"})
+# Reads OTEL_EXPORTER_OTLP_* environment variables automatically
+configure_otel_providers()
 ```
 
 ## Create and run the agent
 
-Create an agent using Agent Framework. The observability will be automatically enabled for the agent once `setup_observability` has been called.
+Create an agent using Agent Framework. Observability is automatically enabled once `configure_otel_providers()` has been called.
 
 ```python
 from agent_framework import ChatAgent
-from agent_framework.azure import AzureOpenAIChatClient
-from azure.identity import AzureCliCredential
+from agent_framework.openai import OpenAIChatClient
 
 # Create the agent - telemetry is automatically enabled
 agent = ChatAgent(
-    chat_client=AzureOpenAIChatClient(
-        credential=AzureCliCredential(),
-        model="gpt-4o-mini"
-    ),
+    chat_client=OpenAIChatClient(),
     name="Joker",
     instructions="You are good at telling jokes."
 )
@@ -276,100 +194,81 @@ result = await agent.run("Tell me a joke about a pirate.")
 print(result.text)
 ```
 
-The console exporter will show trace data on the console similar to the following:
+The console exporter will show trace data similar to:
 
 ```text
 {
     "name": "invoke_agent Joker",
     "context": {
         "trace_id": "0xf2258b51421fe9cf4c0bd428c87b1ae4",
-        "span_id": "0x2cad6fc139dcf01d",
-        "trace_state": "[]"
-    },
-    "kind": "SpanKind.CLIENT",
-    "parent_id": null,
-    "start_time": "2025-09-25T11:00:48.663688Z",
-    "end_time": "2025-09-25T11:00:57.271389Z",
-    "status": {
-        "status_code": "UNSET"
+        "span_id": "0x2cad6fc139dcf01d"
     },
     "attributes": {
         "gen_ai.operation.name": "invoke_agent",
-        "gen_ai.system": "openai",
-        "gen_ai.agent.id": "Joker",
         "gen_ai.agent.name": "Joker",
-        "gen_ai.request.instructions": "You are good at telling jokes.",
-        "gen_ai.response.id": "chatcmpl-CH6fgKwMRGDtGNO3H88gA3AG2o7c5",
         "gen_ai.usage.input_tokens": 26,
         "gen_ai.usage.output_tokens": 29
     }
 }
 ```
 
````
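The emitted span is plain JSON, so its `gen_ai.*` attributes are easy to post-process. As a local illustration using only the sample output shown in the hunk (not an Agent Framework API), total token usage can be computed like this:

```python
import json

# Sample span mirroring the console-exporter output shown in the diff
span = json.loads("""
{
    "name": "invoke_agent Joker",
    "attributes": {
        "gen_ai.operation.name": "invoke_agent",
        "gen_ai.agent.name": "Joker",
        "gen_ai.usage.input_tokens": 26,
        "gen_ai.usage.output_tokens": 29
    }
}
""")

attrs = span["attributes"]
# Sum the input and output token counts recorded on the span
total_tokens = attrs["gen_ai.usage.input_tokens"] + attrs["gen_ai.usage.output_tokens"]
print(total_tokens)  # 55
```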
````diff
-Followed by the text response from the agent:
+## Microsoft Foundry integration
 
-```text
-Why did the pirate go to school?
+If you're using Microsoft Foundry, there's a convenient method that automatically configures Azure Monitor with Application Insights. First, ensure your Foundry project has Azure Monitor configured (see [Monitor applications](/azure/ai-foundry/how-to/monitor-applications)).
 
-Because he wanted to improve his "arrr-ticulation"! ⛵
+```bash
+pip install azure-monitor-opentelemetry
 ```
 
-## Understanding the telemetry output
-
-Once observability is enabled, Agent Framework automatically creates the following spans:
-
-- **`invoke_agent <agent_name>`**: The top-level span for each agent invocation. Contains all other spans as children and includes metadata like agent ID, name, and instructions.
-
-- **`chat <model_name>`**: Created when the agent calls the underlying chat model. Includes the prompt and response as attributes when `enable_sensitive_data` is `True`, along with token usage information.
-
-- **`execute_tool <function_name>`**: Created when the agent calls a function tool. Contains function arguments and results as attributes when `enable_sensitive_data` is `True`.
-
-The following metrics are also collected:
-
-**For chat operations:**
-- `gen_ai.client.operation.duration` (histogram): Duration of each operation in seconds
-- `gen_ai.client.token.usage` (histogram): Token usage in number of tokens
-
-**For function invocations:**
-- `agent_framework.function.invocation.duration` (histogram): Duration of each function execution in seconds
+```python
+from agent_framework.azure import AzureAIClient
+from azure.ai.projects.aio import AIProjectClient
+from azure.identity.aio import AzureCliCredential
+
+async with (
+    AzureCliCredential() as credential,
+    AIProjectClient(endpoint="https://<your-project>.foundry.azure.com", credential=credential) as project_client,
+    AzureAIClient(project_client=project_client) as client,
+):
+    # Automatically configures Azure Monitor with the connection string from the project
+    await client.configure_azure_monitor(enable_live_metrics=True)
+```
 
-## Azure AI Foundry integration
+### Custom agents with Foundry observability
 
-If you're using Azure AI Foundry clients, there's a convenient method for automatic setup:
+For custom agents not created through Foundry, you can register them in the Foundry portal and use the same OpenTelemetry agent ID. See [Register custom agent](/azure/ai-foundry/control-plane/register-custom-agent) for setup instructions.
 
 ```python
-from agent_framework.azure import AzureAIAgentClient
-from azure.identity import AzureCliCredential
-
-agent_client = AzureAIAgentClient(
-    credential=AzureCliCredential(),
-    # endpoint and model_deployment_name can be taken from environment variables
-    # project_endpoint="https://<your-project>.foundry.azure.com"
-    # model_deployment_name="<your-deployment-name>"
+from azure.monitor.opentelemetry import configure_azure_monitor
+from agent_framework import ChatAgent
+from agent_framework.observability import create_resource, enable_instrumentation
+from agent_framework.openai import OpenAIChatClient
+
+# Configure Azure Monitor
+configure_azure_monitor(
+    connection_string="InstrumentationKey=...",
+    resource=create_resource(),
+    enable_live_metrics=True,
 )
+# Optional if ENABLE_INSTRUMENTATION is already set in env vars
+enable_instrumentation()
 
-# Automatically configures observability with Application Insights
-await agent_client.setup_azure_ai_observability()
-```
-
-This method retrieves the Application Insights connection string from your Azure AI Foundry project and calls `setup_observability` automatically. If you want to use Foundry telemetry with other types of agents, you can do the same with:
-```python
-from agent_framework.observability import setup_observability
-from azure.ai.projects import AIProjectClient
-from azure.identity import AzureCliCredential
-
-project_client = AIProjectClient(endpoint, credential=AzureCliCredential())
-conn_string = project_client.telemetry.get_application_insights_connection_string()
-setup_observability(applicationinsights_connection_string=conn_string)
+# Create your agent with the same OpenTelemetry agent ID as registered in Foundry
+agent = ChatAgent(
+    chat_client=OpenAIChatClient(),
+    name="My Agent",
+    instructions="You are a helpful assistant.",
+    id="<OpenTelemetry agent ID>"  # Must match the ID registered in Foundry
+)
+# Use the agent as normal
 ```
-Also see the [relevant Foundry documentation](/azure/ai-foundry/how-to/develop/trace-agents-sdk).
 
-> [!NOTE]
-> When using Azure Monitor for your telemetry, you need to install the `azure-monitor-opentelemetry` package explicitly, as it is not included by default with Agent Framework.
+> [!TIP]
+> For more detailed setup instructions, see the [Microsoft Foundry setup](../../user-guide/observability.md#microsoft-foundry-setup) section in the user guide.
 
 ## Next steps
 
-For more advanced observability scenarios and examples, see the [Agent Observability user guide](../../user-guide/agents/agent-observability.md) and the [observability samples](https://github.com/microsoft/agent-framework/tree/main/python/samples/getting_started/observability) in the GitHub repository.
+For more advanced observability scenarios, including custom exporters, third-party integrations (Langfuse, etc.), Aspire Dashboard setup, and detailed span/metric documentation, see the [Observability user guide](../../user-guide/observability.md).
 
 > [!div class="nextstepaction"]
 > [Persisting conversations](./persisted-conversation.md)
````

agent-framework/user-guide/agents/TOC.yml (6 additions, 8 deletions)

```diff
@@ -2,17 +2,15 @@
   href: agent-types/TOC.yml
 - name: Running Agents
   href: running-agents.md
-- name: Agent Tools
-  href: agent-tools.md
 - name: Multi-Turn Conversations and Threading
   href: multi-turn-conversation.md
-- name: Agent Middleware
-  href: agent-middleware.md
-- name: Agent Retrieval Augmented Generation (RAG)
-  href: agent-rag.md
 - name: Agent Chat History and Memory
   href: agent-memory.md
-- name: Agent Observability
-  href: agent-observability.md
+- name: Agent Tools
+  href: agent-tools.md
+- name: Agent Retrieval Augmented Generation (RAG)
+  href: agent-rag.md
+- name: Agent Middleware
+  href: agent-middleware.md
 - name: Agent Background Responses
   href: agent-background-responses.md
```
