LiteLLM

Arize has first-class support for instrumenting LiteLLM. Traces are fully OpenTelemetry compatible and can be sent to any OpenTelemetry collector for viewing.

In this example, we will instrument litellm.completion calls.

pip install openinference-instrumentation-litellm arize-otel litellm
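LiteLLM reads provider credentials from environment variables. For OpenAI-hosted models such as gpt-3.5-turbo, that means setting OPENAI_API_KEY before running the example below (the value here is a placeholder):

```shell
# Placeholder value — replace with your real key.
# LiteLLM picks up provider credentials from environment
# variables such as OPENAI_API_KEY.
export OPENAI_API_KEY="your-openai-api-key"
```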

Set up LiteLLMInstrumentor to trace your application and send the traces to Arize at the endpoint defined below.

from openinference.instrumentation.litellm import LiteLLMInstrumentor
# Import open-telemetry dependencies
from arize_otel import register_otel, Endpoints

# Setup OTEL via our convenience function
register_otel(
    endpoints = Endpoints.ARIZE,
    space_id = "your-space-id", # in app space settings page
    api_key = "your-api-key", # in app space settings page
    model_id = "your-model-id", # name this to whatever you would like
)

LiteLLMInstrumentor().instrument()

Now, all calls made with litellm are instrumented!

import litellm
litellm.completion(
    model="gpt-3.5-turbo", 
    messages=[{"content": "What's the capital of China?", "role": "user"}]
)

Copyright © 2023 Arize AI, Inc