LlamaIndex

Arize has first-class support for LlamaIndex applications. After instrumentation, you will have a full trace of every part of your LLM application, including inputs, embeddings, retrieval, function calls, and output messages.

Trace data follows a standardized format defined by OpenInference, our open-source package built on OpenTelemetry.

The code snippet below shows how to automatically instrument your LLM application using the LlamaIndexInstrumentor.

import os

# Import open-telemetry dependencies
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Import the automatic instrumentor from OpenInference
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor

# Set the Space and API keys as headers for authentication
ARIZE_SPACE_KEY = "YOUR_SPACE_KEY"  # Replace with the Space key from your Arize account
ARIZE_API_KEY = "YOUR_API_KEY"      # Replace with the API key from your Arize account
headers = f"space_key={ARIZE_SPACE_KEY},api_key={ARIZE_API_KEY}"
os.environ["OTEL_EXPORTER_OTLP_TRACES_HEADERS"] = headers

# Set resource attributes for the name and version for your application
resource = Resource(
    attributes={
        "model_id":"llamaindex-llm-tracing", # Set this to any name you'd like for your app
        "model_version":"1.0", # Set this to a version number string
    }
)

# Define the span processor as an exporter to the desired endpoint
endpoint = "https://otlp.arize.com/v1"
span_exporter = OTLPSpanExporter(endpoint=endpoint)
span_processor = SimpleSpanProcessor(span_exporter=span_exporter)

# Set the tracer provider
tracer_provider = trace_sdk.TracerProvider(resource=resource)
tracer_provider.add_span_processor(span_processor=span_processor)
trace_api.set_tracer_provider(tracer_provider=tracer_provider)

# Finish automatic instrumentation
LlamaIndexInstrumentor().instrument()
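
If you want to confirm locally that spans are being produced before they reach Arize, you can attach a second span processor that prints spans to the console. This is an optional debugging aid and not part of the required Arize setup; ConsoleSpanExporter ships with the standard OpenTelemetry SDK.

# Optional: also print spans to the console for local debugging.
# Remove this processor once you have confirmed traces are arriving in Arize.
from opentelemetry.sdk.trace.export import ConsoleSpanExporter

console_processor = SimpleSpanProcessor(span_exporter=ConsoleSpanExporter())
tracer_provider.add_span_processor(span_processor=console_processor)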

Now start asking questions to your LLM app and watch the traces being collected by Arize (a minimal example follows below). For a more detailed demonstration, check out our Colab tutorial.
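
As a quick smoke test, the sketch below builds a small index and issues a single query; with the instrumentor active, each query like this produces a trace. It assumes llama-index 0.10 or later (which uses the llama_index.core namespace), an OPENAI_API_KEY set for the default LLM and embedding model, and a local "data" directory containing a few documents; adjust these assumptions to your own setup.

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Load a few local documents (the "data" directory here is just an example path)
documents = SimpleDirectoryReader("data").load_data()

# Build an in-memory vector index and query it; with the instrumentor active,
# the embedding, retrieval, and LLM calls below are traced and exported to Arize
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
response = query_engine.query("What is this document about?")
print(response)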
