Arize has first-class support for instrumenting LiteLLM. Traces are fully OpenTelemetry compatible and can be sent to any OpenTelemetry collector for viewing.
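Before running the setup code below, the instrumentation packages need to be installed. A minimal sketch, assuming the standard PyPI package names for LiteLLM, the OpenInference LiteLLM instrumentor, and the Arize OTel convenience package:

```shell
# Install LiteLLM, the OpenInference instrumentor for it, and Arize's OTel helper
pip install litellm openinference-instrumentation-litellm arize-otel
```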
In this example we will instrument `completion` calls.
Set up `LiteLLMInstrumentor` to trace your application and send the traces to Arize at the endpoint defined below.
```python
from openinference.instrumentation.litellm import LiteLLMInstrumentor

# Import open-telemetry dependencies
from arize_otel import register_otel, Endpoints

# Setup OTEL via our convenience function
register_otel(
    endpoints=Endpoints.ARIZE,
    space_id="your-space-id",    # in app space settings page
    api_key="your-api-key",      # in app space settings page
    project_name="your-project-name",  # name this whatever you would like
)

# Instrument LiteLLM
LiteLLMInstrumentor().instrument()
```
Now, all calls made by LiteLLM are instrumented!
```python
import litellm

litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"content": "What's the capital of China?", "role": "user"}],
)
```