LangChain
Forward LangChain application traces to Sazabi for LLM observability.
Send telemetry from your LangChain application directly to Sazabi for comprehensive observability of your LLM-powered applications. Monitor chains, agents, tool calls, and model interactions.
About this data source
LangChain is a framework for building applications with large language models. By configuring OpenTelemetry instrumentation and exporting to Sazabi, you can:
- Trace chain executions through multiple steps
- Monitor agent decisions and tool calls
- Track LLM requests with latency and token metrics
- Detect prompt issues and model errors
- Use AI to analyze and optimize your LLM workflows
Prerequisites
Before you begin, make sure you have:
- A LangChain application (Python, JavaScript, or other supported language)
- OpenTelemetry SDK for your language
- A Sazabi public API key (project-scoped)
Get your API key
Create a public key
In Sazabi, go to Settings > API Keys, click Create API key, and select Public as the key type. Public keys are scoped to a single project.
Copy the key and store it securely. You will not be able to see it again.
Setup
Install OpenTelemetry
Add OpenTelemetry instrumentation to your LangChain application.
pip install opentelemetry-api opentelemetry-sdk opentelemetry-exporter-otlp
npm install @opentelemetry/api @opentelemetry/sdk-node @opentelemetry/exporter-trace-otlp-http
Set environment variables
Configure the standard OTEL environment variables:
OTEL_EXPORTER_OTLP_ENDPOINT=https://otlp.<region>.intake.sazabi.com
OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <your-api-key>"
OTEL_SERVICE_NAME=my-langchain-app
Replace <region> with your Sazabi project region (e.g., us-east-1) and <your-api-key> with your Sazabi public API key.
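If you launch your application from a shell, the same variables can be exported before starting the process. The region and key values below are placeholders:

```shell
# Replace the placeholders with your project region and public API key.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://otlp.<region>.intake.sazabi.com"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <your-api-key>"
export OTEL_SERVICE_NAME="my-langchain-app"
```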
Initialize telemetry
Initialize OpenTelemetry in your application before using LangChain.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
# Configure OTLP exporter (uses env vars automatically)
exporter = OTLPSpanExporter()
provider = TracerProvider()
processor = BatchSpanProcessor(exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)
const { NodeSDK } = require("@opentelemetry/sdk-node");
const { OTLPTraceExporter } = require("@opentelemetry/exporter-trace-otlp-http");
const sdk = new NodeSDK({
traceExporter: new OTLPTraceExporter(),
// Uses env vars for configuration
});
sdk.start();
Deploy your application
Deploy your LangChain application with the new configuration. Traces will begin flowing to Sazabi when chains execute.
What gets captured
LangChain with OpenTelemetry captures:
- Chain executions: Start, completion, and errors for each chain
- Agent traces: Agent decisions, tool selections, and reasoning steps
- LLM calls: Model requests, responses, token counts, and latencies
- Tool invocations: Tool calls with inputs and outputs
- Retriever queries: Document retrieval operations
Features
- Real-time tracing of chains, agents, and tool calls via OpenTelemetry
- LLM call monitoring with token usage and latency metrics
- Compatible with LangGraph and other LangChain integrations
- Works with any language that has an OpenTelemetry SDK
Verifying traces are flowing
Once configured, verify that traces are flowing to Sazabi:
- Run a chain: Execute one of your LangChain chains or agents.
- Ask the assistant: Open a thread in Sazabi and ask "Show me recent LangChain traces" or "What LLM calls were made in the last hour?"
Troubleshooting
Traces not appearing in Sazabi
- Verify environment variables are set correctly
- Check that OpenTelemetry is initialized before LangChain
- Ensure the endpoint URL includes the correct region
- Run a chain to generate traces
401 Unauthorized errors
- Your API key may be invalid or expired
- Verify the OTEL_EXPORTER_OTLP_HEADERS format
- Create a new public API key in Settings > API Keys
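To isolate auth problems from application issues, you can probe the intake endpoint directly. This sketch assumes the endpoint follows the standard OTLP/HTTP path /v1/traces; the region and key values are placeholders:

```shell
# Send an empty POST to the traces path. A 401 response means the API
# key or header is being rejected; other statuses mean auth passed.
curl -s -o /dev/null -w "%{http_code}\n" \
  -X POST "https://otlp.<region>.intake.sazabi.com/v1/traces" \
  -H "Authorization: Bearer <your-api-key>" \
  -H "Content-Type: application/x-protobuf" \
  --data-binary ""
```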
Missing chain or agent data
Make sure your LangChain version supports OpenTelemetry instrumentation. Some older versions may require additional configuration.