agent-framework/user-guide/agents/agent-observability.md
## Enable Observability
To enable observability in your Python application, in most cases you do not need to install anything extra; by default, the following packages are installed:
```text
"opentelemetry-api",
"opentelemetry-sdk",
"opentelemetry-exporter-otlp-proto-grpc",
"opentelemetry-semantic-conventions-ai",
```
The easiest way to enable observability for your application is to set the following environment variables:
This can be used for any compliant OTLP endpoint, such as the [OpenTelemetry Collector](https://opentelemetry.io/docs/collector/), the [Aspire Dashboard](/dotnet/aspire/fundamentals/dashboard/overview?tabs=bash), or any other OTLP-compliant endpoint.
- APPLICATIONINSIGHTS_CONNECTION_STRING
  Default is `None`; set it to your Application Insights connection string to export to Azure Monitor.
  You can find the connection string in the Azure portal, in the "Overview" section of your Application Insights resource. This requires the `azure-monitor-opentelemetry-exporter` package to be installed.
- VS_CODE_EXTENSION_PORT
  Default is `4317`; set it to the port the AI Toolkit or Azure AI Foundry VS Code extension is running on.
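As an illustration of how settings like these are typically consumed, here is a minimal stdlib-only sketch; this is illustrative, not the framework's actual code:

```python
import os

# Illustrative only: the variable names come from the docs above,
# but the surrounding logic is a hypothetical sketch.
app_insights_conn = os.environ.get("APPLICATIONINSIGHTS_CONNECTION_STRING")  # default None
vs_code_port = int(os.environ.get("VS_CODE_EXTENSION_PORT", "4317"))  # default 4317

if app_insights_conn is not None:
    # An Azure Monitor exporter would be configured here; this requires
    # the azure-monitor-opentelemetry-exporter package.
    pass
```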
Azure AI Foundry has built-in support for tracing, with a really great visualization experience.
When you have an Azure AI Foundry project set up with an Application Insights resource, you can do the following:
1) Install the `azure-monitor-opentelemetry-exporter` package:
This is a convenience method that uses the project client to get the Application Insights connection string and then calls `setup_observability` with that connection string, overriding any connection string already set via environment variable.
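Conceptually, the helper behaves like the following sketch. Every name here except `setup_observability` is a hypothetical stand-in, not the real API:

```python
# Hypothetical sketch of the convenience method's behavior; the client class
# and its method name are stand-ins, not the real Azure SDK surface.
class StubProjectClient:
    def get_application_insights_connection_string(self) -> str:
        return "InstrumentationKey=00000000-0000-0000-0000-000000000000"

def setup_observability(applicationinsights_connection_string=None):
    # The real setup_observability configures OpenTelemetry exporters;
    # this stub just records what it was called with.
    return {"connection_string": applicationinsights_connection_string}

client = StubProjectClient()
config = setup_observability(
    applicationinsights_connection_string=client.get_application_insights_connection_string()
)
```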
### Zero-code instrumentation
Because we use the standard OpenTelemetry SDK, you can also use zero-code instrumentation to instrument your application. Run your code like this:
```bash
opentelemetry-instrument \
    --traces_exporter console,otlp \
    --metrics_exporter console \
    --service_name your-service-name \
    --exporter_otlp_endpoint 0.0.0.0:4317 \
    python agent_framework_app.py
```
See the [OpenTelemetry Zero-code Python documentation](https://opentelemetry.io/docs/zero-code/python/) for more information and details of the environment variables used.
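The same configuration can also be expressed with the standard OpenTelemetry environment variables instead of CLI flags (variable names per the OpenTelemetry SDK environment variable specification; the endpoint value is an example), then run `opentelemetry-instrument python agent_framework_app.py` as before:

```bash
# Standard OpenTelemetry environment variables, equivalent to the CLI flags above.
export OTEL_SERVICE_NAME=your-service-name
export OTEL_TRACES_EXPORTER=console,otlp
export OTEL_METRICS_EXPORTER=console
export OTEL_EXPORTER_OTLP_ENDPOINT=http://0.0.0.0:4317
```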
## Spans and metrics
The metrics that are created are:

- For function invocations during `execute_tool` operations:
  - `agent_framework.function.invocation.duration` (histogram): This metric measures the duration of each function execution, in seconds.
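To make the unit concrete, this hypothetical snippet shows the kind of value the histogram records, wall-clock seconds per function call; it is not the framework's actual instrumentation code:

```python
import time

def timed_invoke(fn, *args, **kwargs):
    # Hypothetical helper: measures one function invocation in seconds,
    # the same unit agent_framework.function.invocation.duration records.
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    duration_seconds = time.perf_counter() - start
    return result, duration_seconds

result, duration = timed_invoke(sum, [1, 2, 3])
```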
### Example trace output
When you run an agent with observability enabled, you'll see trace data similar to the following console output:
This trace shows:
- **Model information**: The AI system used (OpenAI) and response ID
- **Token usage**: Input and output token counts for cost tracking
## Samples
We have a number of samples in our repository that demonstrate these capabilities; see the [observability samples folder](https://github.com/microsoft/agent-framework/tree/main/python/samples/getting_started/observability) on GitHub. It includes samples for using zero-code telemetry as well.