Arize has first-class support for instrumenting LiteLLM. Traces are fully OpenTelemetry compatible and can be sent to any OpenTelemetry collector for viewing.
In this example we will instrument `litellm.completion` calls.
Set up `LiteLLMInstrumentor` to trace your application and send the traces to Arize using the space, API key, and project configured below.
```python
# Import open-telemetry dependencies
from arize.otel import register

# Setup OTel via our convenience function
tracer_provider = register(
    space_id="your-space-id",          # in app space settings page
    api_key="your-api-key",            # in app space settings page
    project_name="your-project-name",  # name this to whatever you would like
)

from openinference.instrumentation.litellm import LiteLLMInstrumentor

LiteLLMInstrumentor().instrument(tracer_provider=tracer_provider)
```
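Alternatively, because the traces are fully OpenTelemetry compatible, you can skip the Arize convenience helper and wire the instrumentor to any OTLP-compatible collector with the standard OpenTelemetry SDK. Below is a minimal sketch, assuming `opentelemetry-sdk` and `opentelemetry-exporter-otlp-proto-http` are installed and a collector is listening on the default OTLP/HTTP endpoint (the `localhost:4318` address here is an example, not an Arize endpoint):

```python
# Assumption: an OTLP-compatible collector is reachable at localhost:4318.
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

from openinference.instrumentation.litellm import LiteLLMInstrumentor

# Build a plain OpenTelemetry tracer provider that exports spans to the collector
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces"))
)

# Instrument LiteLLM against that provider instead of the Arize one
LiteLLMInstrumentor().instrument(tracer_provider=tracer_provider)
```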
Now, all LiteLLM calls are instrumented!
```python
import litellm

litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"content": "What's the capital of China?", "role": "user"}],
)
```
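The instrumentor hooks LiteLLM at the library level, so other call patterns are traced the same way. The sketch below, which assumes the instrumentation above is already set up and uses example prompts, shows a streaming call and an async call using LiteLLM's standard `completion(stream=True)` and `acompletion` APIs:

```python
import asyncio
import litellm

# Streaming completion (standard LiteLLM API)
stream = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"content": "Name three large rivers.", "role": "user"}],
    stream=True,
)
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="")

# Async completion
async def main():
    response = await litellm.acompletion(
        model="gpt-3.5-turbo",
        messages=[{"content": "What's the capital of France?", "role": "user"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())
```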