LiteLLM

Arize has first-class support for instrumenting LiteLLM. Traces are fully OpenTelemetry compatible and can be sent to any OpenTelemetry collector for viewing.

Follow our colab guide
See the source code in our repo

In this example we will instrument litellm.completion calls.

pip install openinference-instrumentation-litellm arize-otel litellm

Set up LiteLLMInstrumentor to trace your application and send the traces to Arize, using the credentials configured below.

# Import open-telemetry dependencies
from arize.otel import register

# Setup OTel via our convenience function
tracer_provider = register(
    space_id = "your-space-id", # in app space settings page
    api_key = "your-api-key", # in app space settings page
    project_name = "your-project-name", # name this to whatever you would like
)

from openinference.instrumentation.litellm import LiteLLMInstrumentor
LiteLLMInstrumentor().instrument(tracer_provider=tracer_provider)

Now, all calls by litellm are instrumented!

import litellm
litellm.completion(
    model="gpt-3.5-turbo", 
    messages=[{"content": "What's the capital of China?", "role": "user"}]
)


Copyright © 2023 Arize AI, Inc