Quickstart: Tracing
To trace your LLM app and start troubleshooting your LLM calls, you'll need to do the following:
Run the commands below to install our open source tracing packages, which work on top of OpenTelemetry. The example below uses openai, and we support many LLM providers (see the full list).
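A typical install for the OpenAI example looks like this (the exact package set is an assumption for this example; swap the instrumentation package if you use a different provider):

```shell
# OpenInference auto-instrumentation for openai, plus the OpenTelemetry SDK,
# the OTLP exporter, and the Arize OTel helper package.
pip install openinference-instrumentation-openai arize-otel \
    opentelemetry-sdk opentelemetry-exporter-otlp openai
```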
Go to your space settings in the left navigation, and you will see your API keys on the right-hand side. You'll need the Space ID and API key for the next step.
Arize acts as an OpenTelemetry (OTel) collector, which means you can configure your own tracer and span processor. For more OTel configurability, see how to set your tracer for autoinstrumentors.
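One example of that configurability: the standard OTLP exporter reads its endpoint and headers from environment variables, so you can point traces at Arize without changing code. A minimal sketch — the endpoint and the `space_id`/`api_key` header names here are assumptions; copy the exact values from your space settings:

```python
import os

# Standard OpenTelemetry environment variables honored by OTLP exporters.
# The endpoint and header names below are placeholders — substitute the
# values shown in your Arize space settings.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://otlp.arize.com"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "space_id=YOUR_SPACE_ID,api_key=YOUR_API_KEY"

print(os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"])
```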
Are you coding with JavaScript instead of Python? See our detailed guide on auto-instrumentation or manual instrumentation with JavaScript examples.
The following code snippet showcases how to automatically instrument your OpenAI application.
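A sketch of that auto-instrumentation, assuming the `arize-otel` `register` helper and the OpenInference OpenAI instrumentor are installed (fill in your own Space ID, API key, and project name):

```python
from arize.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

# Configure an OTel tracer provider that exports spans to Arize.
# space_id and api_key come from your space settings.
tracer_provider = register(
    space_id="YOUR_SPACE_ID",
    api_key="YOUR_API_KEY",
    project_name="quickstart",
)

# Patch the openai client so every LLM call emits a span automatically.
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```

Once instrumented, you don't need to change your application code — calls made through the openai client are traced transparently.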
To test, let's send a chat request to OpenAI:
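A minimal chat request, assuming your `OPENAI_API_KEY` is set in the environment (the model name here is just an example):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# This call is captured as a span by the instrumentation set up earlier.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Write a haiku about tracing."}],
)
print(response.choices[0].message.content)
```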
Now start asking questions to your LLM app and watch the traces being collected by Arize. For more examples of instrumenting OpenAI applications, check our openinference-instrumentation-openai examples.
Once you've sent a sufficient number of queries (or chats) to your application, you can view the details on the LLM Tracing page.