Observability

djangosdk supports three observability backends: LangSmith, Langfuse, and OpenTelemetry.

Configuration

AI_SDK = {
    "OBSERVABILITY": {
        "BACKEND": "langsmith",  # "langsmith" | "langfuse" | "opentelemetry" | None
    },
}

LangSmith

LangSmith provides tracing, evaluation, and monitoring for AI applications.

uv add "djangosdk[langsmith]"
import os
os.environ["LANGCHAIN_API_KEY"] = "ls__..."
os.environ["LANGCHAIN_TRACING_V2"] = "true"

AI_SDK = {
    "OBSERVABILITY": {"BACKEND": "langsmith"},
}

Every agent.handle() call is automatically traced as a LangSmith run.

Langfuse

Langfuse is an open-source LLM observability platform.
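A sketch of the setup, mirroring the LangSmith example above. The `[langfuse]` extra is assumed to exist alongside the `[langsmith]` extra, and the `LANGFUSE_*` variable names follow Langfuse's standard client configuration; check the package for the exact names it reads.

```python
# Install the extra first (assumed to mirror the langsmith extra):
#   uv add "djangosdk[langfuse]"
import os

# Standard Langfuse client credentials; the placeholder values are illustrative.
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"

AI_SDK = {
    "OBSERVABILITY": {"BACKEND": "langfuse"},
}
```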

OpenTelemetry

For custom observability pipelines, use the OpenTelemetry backend:
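A minimal configuration sketch. The `AI_SDK` setting mirrors the other backends above; the `OTEL_EXPORTER_OTLP_ENDPOINT` variable is the standard OpenTelemetry environment variable for pointing the OTLP exporter at a collector, and the `[opentelemetry]` extra is an assumption mirroring the `[langsmith]` extra.

```python
# Install the extra first (assumed):
#   uv add "djangosdk[opentelemetry]"
import os

# Standard OTLP exporter setting: send traces to a local collector.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4317"

AI_SDK = {
    "OBSERVABILITY": {"BACKEND": "opentelemetry"},
}
```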

Traces are emitted to the configured OpenTelemetry exporter (OTLP, Jaeger, Zipkin, etc.).

What Gets Traced

| Event             | Data captured                             |
| ----------------- | ----------------------------------------- |
| Agent start       | agent_class, model, provider, prompt      |
| Agent complete    | response_text, token_usage, latency_ms    |
| Tool call         | tool_name, arguments, result              |
| Cache hit/miss    | cache_read_tokens                         |
| Provider failover | from_provider, to_provider                |

Cost Tracking

The analytics module tracks token costs per provider:
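As an illustration of the idea, a per-provider cost estimate can be computed from the `token_usage` captured above. The pricing table and `estimate_cost` helper here are invented for this sketch (the rates are not real prices); the analytics module maintains its own pricing data.

```python
# Hypothetical per-provider rates in USD per 1M tokens -- illustrative
# numbers only, not real prices or djangosdk's actual pricing data.
PRICING = {
    "openai": {"input": 2.50, "output": 10.00},
    "anthropic": {"input": 3.00, "output": 15.00},
}

def estimate_cost(provider: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one call from its token counts."""
    rates = PRICING[provider]
    return (input_tokens * rates["input"]
            + output_tokens * rates["output"]) / 1_000_000
```

Summing these estimates per provider over the captured traces yields the per-provider cost breakdown.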
