LangChain
Arize has first-class support for LangChain applications. After instrumentation, you will have a full trace of every part of your LLM application, including input, embeddings, retrieval, functions, and output messages.
We follow a standardized format for how trace data should be structured using OpenInference, our open-source package based on OpenTelemetry.
Use the code block below to get started with our LangChainInstrumentor.
# Import open-telemetry dependencies
from arize.otel import register
# Setup OTel via our convenience function
tracer_provider = register(
    space_id = "your-space-id",  # in app space settings page
    api_key = "your-api-key",  # in app space settings page
    project_name = "your-project-name",  # name this to whatever you would like
)
# Import the automatic instrumentor from OpenInference
from openinference.instrumentation.langchain import LangChainInstrumentor
# Finish automatic instrumentation
LangChainInstrumentor().instrument(tracer_provider=tracer_provider)
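Once the instrumentor is registered, any chain you run is traced automatically and the spans appear under your project in Arize. Below is a minimal sketch, assuming the langchain-openai package is installed and OPENAI_API_KEY is set (the model name here is illustrative):
# Example: any chain run after instrumenting is traced with no extra code
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

# Build a simple prompt -> LLM chain
prompt = ChatPromptTemplate.from_template("Summarize in one sentence: {text}")
chain = prompt | ChatOpenAI(model="gpt-4o-mini")

# The prompt formatting, LLM call, and output are each captured as spans
result = chain.invoke({"text": "OpenTelemetry standardizes trace collection."})
print(result.content)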
For a more detailed demonstration, check out our Colab tutorial:
Install the LangChain instrumentation package via npm:
npm install @arizeai/openinference-instrumentation-langchain
The example below uses the OpenInference JavaScript LangChain example. Navigate to the backend folder.
In addition to the package above, sending traces to Arize requires the @opentelemetry/exporter-trace-otlp-grpc package. Install it by running the following command in your shell:
npm install @opentelemetry/exporter-trace-otlp-grpc @grpc/grpc-js
Implement instrumentation.ts as shown below (you'll need to install all of the imported packages in the same manner as above):
/*instrumentation.ts */
import { LangChainInstrumentation } from "@arizeai/openinference-instrumentation-langchain";
import { ConsoleSpanExporter } from "@opentelemetry/sdk-trace-base";
import {
  NodeTracerProvider,
  SimpleSpanProcessor,
} from "@opentelemetry/sdk-trace-node";
import { Resource } from "@opentelemetry/resources";
import { OTLPTraceExporter as GrpcOTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-grpc"; // Arize specific
import { diag, DiagConsoleLogger, DiagLogLevel } from "@opentelemetry/api";
import { Metadata } from "@grpc/grpc-js";
import * as CallbackManagerModule from "@langchain/core/callbacks/manager";
// For troubleshooting, set the log level to DiagLogLevel.DEBUG
diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.DEBUG);
// Arize specific - Create metadata and add your headers
const metadata = new Metadata();
// Your Arize Space and API Keys, which can be found in the UI
metadata.set('space_id', 'your-space-id');
metadata.set('api_key', 'your-api-key');
const provider = new NodeTracerProvider({
  resource: new Resource({
    // Arize specific - The name of a new or preexisting model you
    // want to export spans to
    "model_id": "your-model-id",
    "model_version": "your-model-version",
  }),
});
provider.addSpanProcessor(new SimpleSpanProcessor(new ConsoleSpanExporter()));
provider.addSpanProcessor(
  new SimpleSpanProcessor(
    new GrpcOTLPTraceExporter({
      url: "https://otlp.arize.com/v1",
      metadata,
    }),
  ),
);
const lcInstrumentation = new LangChainInstrumentation();
// LangChain must be manually instrumented as it doesn't have
// a traditional module structure
lcInstrumentation.manuallyInstrument(CallbackManagerModule);
provider.register();
If you simultaneously want to send spans to a Phoenix collector, also add the following code blocks from the original instrumentation.ts file.
import {
  OTLPTraceExporter as ProtoOTLPTraceExporter,
} from "@opentelemetry/exporter-trace-otlp-proto";
// add as another SpanProcessor below the previous SpanProcessor
provider.addSpanProcessor(
  new SimpleSpanProcessor(
    new ProtoOTLPTraceExporter({
      // This is the url where your phoenix server is running
      url: "http://localhost:6006/v1/traces",
    }),
  ),
);
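Wherever your application's entry point lives, make sure instrumentation.ts is loaded before any LangChain module so the callback manager is patched first. Below is a minimal sketch of that pattern (the file and model names are illustrative, and OPENAI_API_KEY is assumed to be set):
/* index.ts - hypothetical entry point. This import must come first so
   the tracer provider is registered before any LangChain code loads. */
import "./instrumentation";

import { ChatOpenAI } from "@langchain/openai";

async function main() {
  const model = new ChatOpenAI({ model: "gpt-4o-mini" });

  // This call is captured by the LangChainInstrumentation registered in
  // instrumentation.ts and exported to Arize (and Phoenix, if configured).
  const response = await model.invoke("Say hello in one short sentence.");
  console.log(response.content);
}

main();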
Follow the steps from the backend and frontend READMEs, or simply run:
docker compose up --build
to build and run the frontend, backend, and Phoenix all at the same time. Navigate to localhost:3000 to begin sending messages to the chatbot, and check out your traces in Arize at app.arize.com or in Phoenix at localhost:6006.