
Commit b221a9d

update observability docs
1 parent ad45e83 commit b221a9d

1 file changed

Lines changed: 28 additions & 7 deletions

agent-framework/user-guide/agents/agent-observability.md
@@ -167,13 +167,11 @@ See a full example of an agent with OpenTelemetry enabled in the [Agent Framewor

 ## Enable Observability

-To enable observability in your python application, you do not need to install anything extra, by default the following package are installed:
+To enable observability in your Python application, in most cases you do not need to install anything extra; by default the following packages are installed:

 ```text
 "opentelemetry-api",
 "opentelemetry-sdk",
-"azure-monitor-opentelemetry",
-"azure-monitor-opentelemetry-exporter",
 "opentelemetry-exporter-otlp-proto-grpc",
 "opentelemetry-semantic-conventions-ai",
 ```
@@ -220,7 +218,7 @@ The easiest way to enable observability for your application is to set the follo
   This can be used for any compliant OTLP endpoint, such as [OpenTelemetry Collector](https://opentelemetry.io/docs/collector/), [Aspire Dashboard](/dotnet/aspire/fundamentals/dashboard/overview?tabs=bash) or any other OTLP compliant endpoint.
 - APPLICATIONINSIGHTS_CONNECTION_STRING
   Default is `None`, set to your Application Insights connection string to export to Azure Monitor.
-  You can find the connection string in the Azure portal, in the "Overview" section of your Application Insights resource.
+  You can find the connection string in the Azure portal, in the "Overview" section of your Application Insights resource. This requires the `azure-monitor-opentelemetry-exporter` package to be installed.
 - VS_CODE_EXTENSION_PORT
   Default is `4317`, set to the port the AI Toolkit or Azure AI Foundry VS Code extension is running on.
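The Azure-related variables above can also be set in-process before the framework initializes telemetry. A minimal sketch with placeholder values (the connection string below is not a real key, and doing this via `os.environ` rather than a `.env` file is just one option):

```python
import os

# Placeholder values; these variable names are the ones documented above.
# Set them before any telemetry setup code runs.
os.environ["APPLICATIONINSIGHTS_CONNECTION_STRING"] = (
    "InstrumentationKey=00000000-0000-0000-0000-000000000000"
)
os.environ["VS_CODE_EXTENSION_PORT"] = "4317"
```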

@@ -260,6 +258,14 @@ Azure AI Foundry has built-in support for tracing, with a really great visualiza

 When you have an Azure AI Foundry project set up with an Application Insights resource, you can do the following:

+1) Install the `azure-monitor-opentelemetry-exporter` package:
+
+   ```bash
+   pip install "azure-monitor-opentelemetry-exporter>=1.0.0b41"
+   ```
+
+2) Then you can set up observability for your Azure AI Foundry project as follows:
+
 ```python
 from agent_framework.azure import AzureAIAgentClient
 from azure.identity import AzureCliCredential
@@ -269,7 +275,22 @@ agent_client = AzureAIAgentClient(credential=AzureCliCredential(), project_endpo
 await agent_client.setup_azure_ai_observability()
 ```

-This is a convenience method, that will use the project client, to get the Application Insights connection string, and then call `setup_observability` with that connection string.
+This is a convenience method that uses the project client to get the Application Insights connection string and then calls `setup_observability` with that connection string, overriding any existing connection string set via environment variable.
+
+### Zero-code instrumentation
+
+Because we use the standard OpenTelemetry SDK, you can also use zero-code instrumentation to instrument your application; run your code like this:
+
+```bash
+opentelemetry-instrument \
+    --traces_exporter console,otlp \
+    --metrics_exporter console \
+    --service_name your-service-name \
+    --exporter_otlp_endpoint 0.0.0.0:4317 \
+    python agent_framework_app.py
+```
+
+See the [OpenTelemetry Zero-code Python documentation](https://opentelemetry.io/docs/zero-code/python/) for more information and details of the environment variables used.
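As a sketch, the `opentelemetry-instrument` flags shown above map to standard OpenTelemetry environment variables (names per the generic zero-code documentation, not specific to the Agent Framework; the `http://` scheme on the endpoint is an assumption):

```shell
# Equivalents of the CLI flags above, as standard OTel environment variables.
export OTEL_TRACES_EXPORTER="console,otlp"
export OTEL_METRICS_EXPORTER="console"
export OTEL_SERVICE_NAME="your-service-name"
export OTEL_EXPORTER_OTLP_ENDPOINT="http://0.0.0.0:4317"
# Then the app can be launched without flags:
# opentelemetry-instrument python agent_framework_app.py
```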

 ## Spans and metrics

@@ -288,7 +309,7 @@ The metrics that are created are:
 - For function invocation during the `execute_tool` operations:
   - `agent_framework.function.invocation.duration` (histogram): This metric measures the duration of each function execution, in seconds.

-## Example trace output
+### Example trace output

 When you run an agent with observability enabled, you'll see trace data similar to the following console output:

@@ -328,7 +349,7 @@ This trace shows:
 - **Model information**: The AI system used (OpenAI) and response ID
 - **Token usage**: Input and output token counts for cost tracking

-## Getting started
+## Samples

 We have a number of samples in our repository that demonstrate these capabilities; see the [observability samples folder](https://github.com/microsoft/agent-framework/tree/main/python/samples/getting_started/observability) on GitHub. That includes samples for using zero-code telemetry as well.
