LangChain

Arize has first-class support for LangChain applications. After instrumentation, you will have a full trace of every part of your LLM application, including inputs, embeddings, retrieval, function calls, and output messages.

We follow a standardized format for how trace data should be structured using OpenInference, our open-source package based on OpenTelemetry.
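
As a rough illustration, an LLM span under these conventions carries flat key-value attributes. The attribute names below follow the OpenInference semantic conventions; the values are invented for this sketch:

# Illustrative only: the kinds of attributes a single LLM span might carry.
# Names follow the OpenInference semantic conventions; values are made up.
llm_span_attributes = {
    "openinference.span.kind": "LLM",
    "llm.model_name": "gpt-4o-mini",
    "input.value": "What is OpenTelemetry?",
    "output.value": "OpenTelemetry is an open-source observability framework.",
}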

Use the code block below to get started with our LangChainInstrumentor.
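
Before running it, install the required packages (names as published on PyPI; pin versions as appropriate for your project):

pip install openinference-instrumentation-langchain opentelemetry-sdk opentelemetry-exporter-otlp langchain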

import os

# Import open-telemetry dependencies
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Import the automatic instrumentor from OpenInference
from openinference.instrumentation.langchain import LangChainInstrumentor

# Set the Space and API keys as headers for authentication
ARIZE_SPACE_KEY = "YOUR_SPACE_KEY"  # Copy from your Arize space settings
ARIZE_API_KEY = "YOUR_API_KEY"      # Copy from your Arize space settings
headers = f"space_key={ARIZE_SPACE_KEY},api_key={ARIZE_API_KEY}"
os.environ['OTEL_EXPORTER_OTLP_TRACES_HEADERS'] = headers

# Set resource attributes for the name and version for your application
resource = Resource(
    attributes={
        "model_id":"langchain-llm-tracing", # Set this to any name you'd like for your app
        "model_version":"1.0", # Set this to a version number string
    }
)

# Configure the OTLP exporter and span processor to send traces to the Arize endpoint
endpoint = "https://otlp.arize.com/v1"
span_exporter = OTLPSpanExporter(endpoint=endpoint)
span_processor = SimpleSpanProcessor(span_exporter=span_exporter)

# Set the tracer provider
tracer_provider = trace_sdk.TracerProvider(resource=resource)
tracer_provider.add_span_processor(span_processor=span_processor)
trace_api.set_tracer_provider(tracer_provider=tracer_provider)

# Start automatic instrumentation of LangChain
LangChainInstrumentor().instrument()
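
Once instrumented, any LangChain execution is traced automatically. The sketch below is one way to verify traces are flowing; it assumes the langchain-openai package is installed, an OPENAI_API_KEY environment variable is set, and the model name is only an example:

# Example only: run a minimal chain so spans are generated and exported.
# Assumes langchain-openai is installed and OPENAI_API_KEY is set.
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template("Answer briefly: {question}")
llm = ChatOpenAI(model="gpt-4o-mini")  # model name is an example
chain = prompt | llm

# Each invocation is captured by LangChainInstrumentor and exported to Arize
response = chain.invoke({"question": "What is OpenTelemetry?"})
print(response.content)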

For a more detailed demonstration, check out our Colab tutorial.
