::: zone-end

::: zone pivot="programming-language-python"

This tutorial shows how to quickly enable OpenTelemetry on an agent so that interactions with the agent are automatically logged and exported.

For comprehensive documentation on observability including all configuration options, environment variables, and advanced scenarios, see the [Observability user guide](../../user-guide/observability.md).
## Prerequisites

For prerequisites, see the [Create and run a simple agent](./run-agent.md) step in this tutorial.
## Install packages

To use Agent Framework with OpenTelemetry, install the framework:

```bash
pip install agent-framework --pre
```
For console output during development, no additional packages are needed. For other exporters, see the [Dependencies section](../../user-guide/observability.md#dependencies) in the user guide.
## Enable OpenTelemetry in your app

The simplest way to enable observability is using `configure_otel_providers()`:

```python
from agent_framework.observability import configure_otel_providers

# With no specific exporter configured, telemetry is written
# to the console by default, which is useful for local debugging.
configure_otel_providers()
```
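Once enabled, the framework emits spans such as `invoke_agent <agent_name>`, `chat <model_name>`, and `execute_tool <function_name>`, each carrying a duration and attributes. To build intuition for what gets recorded, here is a framework-free sketch using only the standard library (illustrative only, not the framework's actual implementation; the attribute names are examples):

```python
import time
from contextlib import contextmanager

@contextmanager
def span(name, attributes, collected):
    """Record a named span with attributes and a measured duration."""
    start = time.perf_counter()
    try:
        yield
    finally:
        collected.append({
            "name": name,
            "attributes": attributes,
            "duration_s": time.perf_counter() - start,
        })

spans = []
# Nested spans, mirroring how an agent invocation wraps the model call
with span("invoke_agent MyAgent", {"agent.name": "MyAgent"}, spans):
    with span("chat gpt-4o", {"gen_ai.request.model": "gpt-4o"}, spans):
        pass  # the model call would happen here

# Inner spans complete (and are recorded) before their parents
print([s["name"] for s in spans])  # → ['chat gpt-4o', 'invoke_agent MyAgent']
```

A real exporter receives the same kind of data and forwards it to the console, an OTLP endpoint, or Azure Monitor.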
Or use environment variables for more flexibility:
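For example (a sketch; `ENABLE_INSTRUMENTATION` and `OTLP_ENDPOINT` are the variable names used elsewhere in these docs, but verify the full supported set in the user guide):

```shell
# Enable instrumentation and export to a local OTLP endpoint
# (variable names assumed from this framework's observability docs)
export ENABLE_INSTRUMENTATION=true
export OTLP_ENDPOINT="http://localhost:4317"
```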
If you're using Microsoft Foundry, there's a convenient method that automatically configures Azure Monitor with Application Insights. First ensure your Foundry project has Azure Monitor configured (see [Monitor applications](/azure/ai-foundry/how-to/monitor-applications)).
```bash
pip install azure-monitor-opentelemetry
```
```python
from agent_framework.azure import AzureAIClient
from azure.ai.projects.aio import AIProjectClient
from azure.identity.aio import AzureCliCredential

async with (
    AzureCliCredential() as credential,
    AIProjectClient(endpoint="https://<your-project>.foundry.azure.com", credential=credential) as project_client,
    AzureAIClient(project_client=project_client) as client,
):
    # Automatically configures Azure Monitor with connection string from project
    ...
```
For custom agents not created through Foundry, you can register them in the Foundry portal and use the same OpenTelemetry agent ID. See [Register custom agent](/azure/ai-foundry/control-plane/register-custom-agent) for setup instructions.
```python
from azure.monitor.opentelemetry import configure_azure_monitor
from agent_framework import ChatAgent
from agent_framework.observability import create_resource, enable_instrumentation
from agent_framework.openai import OpenAIChatClient

# Configure Azure Monitor
configure_azure_monitor(
    connection_string="InstrumentationKey=...",
    resource=create_resource(),
    enable_live_metrics=True,
)
# Optional if ENABLE_INSTRUMENTATION is already set in env vars
enable_instrumentation()

# Create your agent with the same OpenTelemetry agent ID as registered in Foundry
agent = ChatAgent(
    chat_client=OpenAIChatClient(),
    name="My Agent",
    instructions="You are a helpful assistant.",
    id="<OpenTelemetry agent ID>",  # Must match the ID registered in Foundry
)
# Use the agent as normal
```
> [!TIP]
> For more detailed setup instructions, see the [Microsoft Foundry setup](../../user-guide/observability.md#microsoft-foundry-setup) section in the user guide.
## Next steps

For more advanced observability scenarios including custom exporters, third-party integrations (Langfuse, etc.), Aspire Dashboard setup, and detailed span/metric documentation, see the [Observability user guide](../../user-guide/observability.md).