diff --git a/content/en/llm_observability/instrumentation/otel_instrumentation.md b/content/en/llm_observability/instrumentation/otel_instrumentation.md
index 1ee5dca056e..95670731ea0 100644
--- a/content/en/llm_observability/instrumentation/otel_instrumentation.md
+++ b/content/en/llm_observability/instrumentation/otel_instrumentation.md
@@ -66,7 +66,7 @@ After your application starts sending data, the traces automatically appear in t
-### Examples
+## Examples
#### Using Strands Agents
@@ -210,6 +210,32 @@ LLM Observability supports spans that follow the OpenTelemetry 1.37+ semantic co
For the complete list of supported attributes and their specifications, see the [OpenTelemetry semantic conventions for generative AI documentation][1].
+## Disabling LLM Observability conversion
+
+If you want your generative AI spans to remain in APM only, without appearing in LLM Observability, disable the automatic conversion by setting the `dd_llmobs_enabled` attribute to `false`. Setting this attribute on any span in a trace prevents the entire trace from being converted to LLM Observability.
+
+### Using environment variables
+
+Add `dd_llmobs_enabled=false` to your `OTEL_RESOURCE_ATTRIBUTES` environment variable:
+
+```
+OTEL_RESOURCE_ATTRIBUTES=dd_llmobs_enabled=false
+```
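+
+If you already set other resource attributes, you can append `dd_llmobs_enabled=false` as an additional comma-separated key-value pair. For example (the `service.name` value below is illustrative):
+
+```
+OTEL_RESOURCE_ATTRIBUTES=service.name=my-service,dd_llmobs_enabled=false
+```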
+
+### Using code
+
+You can also set the attribute programmatically on any span in your trace:
+
+```python
+from opentelemetry import trace
+
+tracer = trace.get_tracer(__name__)
+
+with tracer.start_as_current_span("my-span") as span:
+ # Disable LLM Observability conversion for this entire trace
+ span.set_attribute("dd_llmobs_enabled", False)
+```
+
[1]: https://opentelemetry.io/docs/specs/semconv/gen-ai/gen-ai-agent-spans/#spans
[2]: https://app.datadoghq.com/organization-settings/api-keys
[3]: https://app.datadoghq.com/llm/traces
diff --git a/content/en/opentelemetry/compatibility.md b/content/en/opentelemetry/compatibility.md
index 6f8e4b93428..07ff4451213 100644
--- a/content/en/opentelemetry/compatibility.md
+++ b/content/en/opentelemetry/compatibility.md
@@ -31,6 +31,7 @@ The following table shows feature compatibility across different setups:
| [Cloud SIEM][18] | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} |
| [Correlated Traces, Metrics, Logs][19] | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} |
| [Distributed Tracing][27] | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} |
+| [LLM Observability][38] | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} |
| [Runtime Metrics][23] | {{< X >}} | {{< X >}} (Java, .NET, Go only) | {{< X >}} (Java, .NET, Go only) | {{< X >}} (Java, .NET, Go only) |
| [Span Links][25] | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} |
| [Trace Metrics][26] | {{< X >}} | {{< X >}} | {{< X >}} | {{< X >}} ({{< tooltip text="Sampled" tooltip="Calculated from spans that reach Datadog; reflects any OTel-side sampling you configure." >}}) |
@@ -63,6 +64,10 @@ Datadog provides support for the OpenTelemetry Traces, Metrics, and Logs APIs ac
## More details
+### LLM Observability
+
+OpenTelemetry traces that have [generative AI attributes](https://opentelemetry.io/docs/specs/semconv/gen-ai/gen-ai-spans/) are automatically converted into LLM Observability traces. To disable this conversion, see [Disabling LLM Observability conversion][38].
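+
+As a minimal sketch, a span annotated like the following carries generative AI semantic convention attributes and is therefore eligible for conversion (the span name, model, and token counts are illustrative only):
+
+```python
+from opentelemetry import trace
+
+tracer = trace.get_tracer(__name__)
+
+# Span annotated with OpenTelemetry gen_ai semantic convention attributes
+with tracer.start_as_current_span("chat gpt-4o") as span:
+    span.set_attribute("gen_ai.operation.name", "chat")
+    span.set_attribute("gen_ai.request.model", "gpt-4o")
+    span.set_attribute("gen_ai.usage.input_tokens", 120)
+    span.set_attribute("gen_ai.usage.output_tokens", 45)
+```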
+
### Runtime metrics
Setups using the OpenTelemetry SDK follow the [OpenTelemetry Runtime Metrics][1] specification.
@@ -137,3 +142,4 @@ When using Datadog and OpenTelemetry together, Datadog recommends the following
[35]: /opentelemetry/instrument/api_support/go/
[36]: /opentelemetry/instrument/api_support/ruby/
[37]: /opentelemetry/instrument/api_support/php/
+[38]: /llm_observability/instrumentation/otel_instrumentation/