Tracing is a powerful tool for understanding the behavior of your LLM application. Phoenix offers best-in-class tracing regardless of the framework you use, with first-class instrumentation for a variety of frameworks (LlamaIndex, LangChain, DSPy), SDKs (OpenAI, Bedrock, Mistral, Vertex), and languages (Python, JavaScript). You can also manually instrument your application using the OpenTelemetry SDK.
This example will walk you through how to use Phoenix to trace OpenAI requests.
Let's start by installing Phoenix. You have a few options for how to do this:
The easiest way to use Phoenix is with a free persistent instance hosted on our site. Sign up for an Arize Phoenix account at https://app.phoenix.arize.com.
Once you're there, grab your API key from the Keys option in the left sidebar.
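To point your application at this cloud instance, set your API key and the collector endpoint before any tracing code runs. A minimal sketch, assuming the standard PHOENIX_CLIENT_HEADERS and PHOENIX_COLLECTOR_ENDPOINT environment variables and a placeholder key:

```python
import os

# Hypothetical value: substitute the API key copied from the Keys page.
PHOENIX_API_KEY = "your-api-key"

# These environment variables are read when exporting traces to Phoenix Cloud.
os.environ["PHOENIX_CLIENT_HEADERS"] = f"api_key={PHOENIX_API_KEY}"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://app.phoenix.arize.com"
```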
To collect traces from your application, you must configure an OpenTelemetry TracerProvider to send traces to Phoenix. The `register` utility from the `phoenix.otel` module streamlines this process.
If `arize-phoenix` is not installed in your Python environment, you can use `arize-phoenix-otel` to quickly connect to your Phoenix instance.
Connect your application to your cloud instance using:
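A minimal sketch, assuming you have installed arize-phoenix-otel (for example with pip install arize-phoenix-otel) and that the project name below is a placeholder of your own choosing:

```python
from phoenix.otel import register

# Registers an OpenTelemetry TracerProvider that exports spans to the
# endpoint configured above (or to a local Phoenix instance by default).
tracer_provider = register(
    project_name="my-llm-app",  # hypothetical project name
)
```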
Here we're using OpenAI, so we'll install the built-in OpenAI instrumentor we provide.
Initialize the OpenAIInstrumentor before your application code:
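A sketch using the OpenInference OpenAI instrumentor (installable as openinference-instrumentation-openai), passing in the tracer provider returned by register above:

```python
from openinference.instrumentation.openai import OpenAIInstrumentor

# Patches the OpenAI client library so every request is captured as a span.
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```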
From here we can use OpenAI as normal. All of our requests will be traced and reported to Phoenix automatically.
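For example, a plain chat completion call, assuming OPENAI_API_KEY is set in your environment (the model name and prompt are just placeholders), will now show up as a trace in Phoenix:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name
    messages=[{"role": "user", "content": "Write a haiku about tracing."}],
)
print(response.choices[0].message.content)
```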
You should now see traces in Phoenix!
By default, notebook instances do not have persistent storage, so your traces will disappear after the notebook is closed. Use one of the other deployment options if you need to retain traces.
You do not have to use phoenix.otel to connect to your Phoenix instance; you can use the OpenTelemetry SDK directly to initialize your OTel connection.
See the documentation for more details on configuration and setup.
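As a hedged sketch of that approach, the following uses the standard OpenTelemetry SDK with an OTLP/HTTP exporter; the endpoint shown is the default for a local Phoenix instance and should be adjusted (and given auth headers) for Phoenix Cloud:

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Export spans over OTLP/HTTP to Phoenix; adjust the endpoint to match your deployment.
exporter = OTLPSpanExporter(endpoint="http://localhost:6006/v1/traces")
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(tracer_provider)
```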
Now we need to indicate which methods and attributes we want to trace. Phoenix has a number of built-in tracers for popular frameworks and provides tools to manually instrument your application if needed, as sketched below.
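For example, a minimal sketch of manual instrumentation with the OpenTelemetry API; the function, attribute keys, and retrieval logic below are illustrative placeholders rather than part of any Phoenix API:

```python
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

def retrieve_documents(query: str) -> list[str]:
    # Wrap any step you want to see as its own span in Phoenix.
    with tracer.start_as_current_span("retrieve_documents") as span:
        span.set_attribute("input.value", query)   # illustrative attribute key
        docs = ["..."]                             # your retrieval logic here
        span.set_attribute("output.value", str(docs))
        return docs
```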
Next steps:
- View more details on tracing
- Run evaluations on traces
- Test changes to your prompts, models, and application via experiments
- Explore deployment options for Phoenix