Tracing is a powerful tool for understanding the behavior of your LLM application. Phoenix has best-in-class tracing regardless of the framework you use, with first-class instrumentation for a variety of frameworks (LlamaIndex, LangChain, DSPy), SDKs (OpenAI, Bedrock, Mistral, Vertex), and languages (Python, JavaScript). You can also manually instrument your application using the OpenTelemetry SDK.
This example will walk you through how to use Phoenix to trace OpenAI requests.
Install & Launch Phoenix
Let's start by installing Phoenix. You have a few options for how to do this:
The easiest way to use Phoenix is by accessing a free persistent instance provided on our site. Sign up for an Arize Phoenix account at https://app.phoenix.arize.com/login
Once you're there, grab your API key from the Keys option on the left bar.
If you'd rather run Phoenix locally, you can instead use one of our self-hosting options. For more detail on each of these, see Self-hosting.
Using Terminal
Install the Phoenix package:
pip install arize-phoenix
Launch the Phoenix server:
phoenix serve
This will expose the Phoenix UI and REST API on localhost:6006 and the gRPC endpoint for spans on localhost:4317.
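To confirm the server is up, you can request the UI from another terminal (a simple smoke test; no dedicated health endpoint is assumed here):

curl http://localhost:6006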
Using Docker
Phoenix server images are available on Docker Hub and can be run standalone or via docker compose, which is handy when you want a long-running Phoenix instance to share with your team (a minimal compose sketch follows the run command below). Pull the latest image:
docker pull arizephoenix/phoenix:latest
Launch the Phoenix Docker image using:
docker run -p 6006:6006 -p 4317:4317 arizephoenix/phoenix:latest
This will expose the Phoenix UI and REST API on localhost:6006 and the gRPC endpoint for spans on localhost:4317.
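Since the image also works with docker compose, here is a minimal compose sketch; the file and service names are illustrative, and you may want to add a volume for persistent storage:

# docker-compose.yml (illustrative)
services:
  phoenix:
    image: arizephoenix/phoenix:latest
    ports:
      - "6006:6006"  # UI and REST API
      - "4317:4317"  # gRPC endpoint for spans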
As a final option, you can run a temporary version of Phoenix directly in your notebook.
Install Phoenix using:
pip install arize-phoenix
Within your notebook, launch Phoenix using:
import phoenix as px
px.launch_app()
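launch_app returns a session handle, so if you want the UI URL programmatically (assuming a recent Phoenix release, where the returned session exposes a url attribute):

import phoenix as px

session = px.launch_app()
print(session.url)  # e.g. http://localhost:6006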
By default, notebook instances do not have persistent storage, so your traces will disappear after the notebook is closed. See Persistence or use one of the other deployment options to retain traces.
Connect your application
To collect traces from your application, you must configure an OpenTelemetry TracerProvider to send traces to Phoenix. The register utility from the phoenix.otel module streamlines this process.
If arize-phoenix is not installed in your Python environment, you can use arize-phoenix-otel to quickly connect to your Phoenix instance.
pip install arize-phoenix-otel
Connect your application to your cloud instance using:
import os
from phoenix.otel import register

# Add Phoenix API Key for tracing
PHOENIX_API_KEY = "ADD YOUR API KEY"
os.environ["PHOENIX_CLIENT_HEADERS"] = f"api_key={PHOENIX_API_KEY}"

# configure the Phoenix tracer
tracer_provider = register(
    project_name="my-llm-app",  # Default is 'default'
    endpoint="https://app.phoenix.arize.com/v1/traces",
)
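If you'd rather keep the key out of your source code, you can set the same header through your shell before starting the app; the variable name below matches the one the snippet sets via os.environ:

export PHOENIX_CLIENT_HEADERS="api_key=YOUR_API_KEY"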
If you're self-hosting instead, connect your application to your local instance using:
from phoenix.otel import register

# defaults to endpoint="http://localhost:4317"
tracer_provider = register(
    project_name="my-llm-app",  # Default is 'default'
    endpoint="http://localhost:4317",  # Sends traces using gRPC
)
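If you prefer exporting over HTTP instead of gRPC, Phoenix also accepts OTLP traces on the UI port; a sketch assuming the default local collector path of /v1/traces:

from phoenix.otel import register

tracer_provider = register(
    project_name="my-llm-app",
    endpoint="http://localhost:6006/v1/traces",  # Sends traces using HTTP
)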
You do not have to use phoenix.otel to connect to your Phoenix instance; you can initialize the connection with the OpenTelemetry SDK directly. See Using OTEL Python Directly. A minimal sketch follows.
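This sketch uses only the standard opentelemetry-sdk and opentelemetry-exporter-otlp packages; note that service.name is a generic OTel resource attribute, and how Phoenix maps resource attributes to project names is not covered here:

from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Point the OTLP exporter at Phoenix's gRPC endpoint (localhost:4317 by default)
tracer_provider = TracerProvider(resource=Resource.create({"service.name": "my-llm-app"}))
tracer_provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4317"))
)
trace.set_tracer_provider(tracer_provider)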
Instrument your application
Now we need to indicate which methods and attributes we want to trace. Phoenix has a number of built-in instrumentors for popular frameworks and provides tools to manually instrument your application if needed. See here for a list of integrations.
Here we're using OpenAI, so we'll install the built-in OpenAI instrumentor we provide.
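Assuming the OpenInference instrumentor package for OpenAI (openinference-instrumentation-openai), install it alongside the OpenAI SDK:

pip install openinference-instrumentation-openai openai

Then instrument the OpenAI SDK, passing in the tracer_provider you configured above so spans are routed to Phoenix:

from openinference.instrumentation.openai import OpenAIInstrumentor

# Patches the OpenAI client so each request/response is captured as a span
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)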