@@ -66,7 +66,7 @@ After your application starts sending data, the traces automatically appear in t

</div>

### Examples
## Examples

#### Using Strands Agents

@@ -210,6 +210,32 @@ LLM Observability supports spans that follow the OpenTelemetry 1.37+ semantic co

For the complete list of supported attributes and their specifications, see the [OpenTelemetry semantic conventions for generative AI documentation][1].

## Disabling LLM Observability conversion

If you want your generative AI spans to remain only in APM and not appear in LLM Observability, you can disable the automatic conversion by setting the `dd_llmobs_enabled` attribute to `false`. Setting this attribute on any span in a trace prevents the entire trace from being converted to LLM Observability.

### Using environment variables

Add `dd_llmobs_enabled=false` to your `OTEL_RESOURCE_ATTRIBUTES` environment variable:

```
OTEL_RESOURCE_ATTRIBUTES=dd_llmobs_enabled=false
```
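
If you already set other resource attributes through `OTEL_RESOURCE_ATTRIBUTES`, append the flag as an additional comma-separated key-value pair. For example (the `service.name` value here is only illustrative):

```
OTEL_RESOURCE_ATTRIBUTES=service.name=my-service,dd_llmobs_enabled=false
```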

### Using code

You can also set the attribute programmatically on any span in your trace:

```python
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

with tracer.start_as_current_span("my-span") as span:
    # Disable LLM Observability conversion for this entire trace
    span.set_attribute("dd_llmobs_enabled", False)
```
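
If you configure the OpenTelemetry SDK in code rather than through environment variables, you can set the same flag as a resource attribute when you build the tracer provider. The following is a minimal sketch using the OpenTelemetry Python SDK, equivalent to the `OTEL_RESOURCE_ATTRIBUTES` example above:

```python
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider

# Attach dd_llmobs_enabled=false as a resource attribute so every span
# produced by this tracer provider opts out of the conversion.
resource = Resource.create({"dd_llmobs_enabled": "false"})
trace.set_tracer_provider(TracerProvider(resource=resource))
```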

[1]: https://opentelemetry.io/docs/specs/semconv/gen-ai/gen-ai-agent-spans/#spans
[2]: https://app.datadoghq.com/organization-settings/api-keys
[3]: https://app.datadoghq.com/llm/traces
6 changes: 6 additions & 0 deletions content/en/opentelemetry/compatibility.md
@@ -31,6 +31,7 @@ The following table shows feature compatibility across different setups:
| [Cloud SIEM][18] | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} |
| [Correlated Traces, Metrics, Logs][19] | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} |
| [Distributed Tracing][27] | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} |
| [LLM Observability][38] | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} |
| [Runtime Metrics][23] | {{< X >}} | {{< X >}}<br>(Java, .NET, Go only) | {{< X >}}<br>(Java, .NET, Go only) | {{< X >}}<br>(Java, .NET, Go only) |
| [Span Links][25] | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} |
| [Trace Metrics][26] | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}}<br>({{< tooltip text="Sampled" tooltip="Calculated from spans that reach Datadog; reflects any OTel-side sampling you configure." >}}) |
@@ -63,6 +64,10 @@ Datadog provides support for the OpenTelemetry Traces, Metrics, and Logs APIs ac

## More details

### LLM Observability

OpenTelemetry traces that have [generative AI attributes](https://opentelemetry.io/docs/specs/semconv/gen-ai/gen-ai-spans/) are automatically converted into LLM Observability traces. To disable this conversion, see [Disabling LLM Observability conversion][38].
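
For example, a span that carries generative AI semantic convention attributes such as `gen_ai.operation.name` and `gen_ai.request.model` is picked up by this conversion. The following minimal sketch uses the OpenTelemetry Python API; the span name and attribute values are placeholders:

```python
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

# Spans carrying generative AI semantic-convention attributes are
# automatically converted into LLM Observability traces.
with tracer.start_as_current_span("chat") as span:
    span.set_attribute("gen_ai.operation.name", "chat")
    span.set_attribute("gen_ai.request.model", "gpt-4")
```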

### Runtime metrics

Setups using the OpenTelemetry SDK follow the [OpenTelemetry Runtime Metrics][1] specification.
@@ -137,3 +142,4 @@ When using Datadog and OpenTelemetry together, Datadog recommends the following
[35]: /opentelemetry/instrument/api_support/go/
[36]: /opentelemetry/instrument/api_support/ruby/
[37]: /opentelemetry/instrument/api_support/php/
[38]: /llm_observability/instrumentation/otel_instrumentation/