Quickstart: Tracing

To trace your LLM app and start troubleshooting your LLM calls, you'll need to do the following:

Install our tracing packages

Run the following commands to install our open source tracing packages, which work on top of OpenTelemetry. The example below uses OpenAI, and we support many LLM providers (see the full list).

pip install arize-otel openai openinference-instrumentation-openai opentelemetry-sdk opentelemetry-exporter-otlp

Get your API keys

Go to your space settings in the left navigation; your API keys are shown on the right-hand side. You'll need the space key and API key for the next step.
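
If you prefer not to hard-code the keys, you can read them from environment variables at startup. This is a minimal sketch; the variable names ARIZE_SPACE_KEY and ARIZE_API_KEY are placeholders for this example, not names required by the SDK.

import os

# Placeholder variable names -- export these in your shell before running
space_key = os.environ["ARIZE_SPACE_KEY"]
api_key = os.environ["ARIZE_API_KEY"]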

Add our tracing code

Arize acts as an OpenTelemetry collector, which means you can configure your own tracer and span processor. For more OTEL configurability, see how to set your tracer for autoinstrumentors.
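
If you want to configure OpenTelemetry yourself instead of using the convenience function shown below, a minimal sketch looks like the following. The endpoint and header names here are assumptions for illustration; use the values shown in your space settings.

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Export spans to Arize over OTLP/gRPC (endpoint and header names are assumed values)
exporter = OTLPSpanExporter(
    endpoint="otlp.arize.com",  # assumed endpoint; confirm in your space settings
    headers=(("space_key", "your-space-key"), ("api_key", "your-api-key")),
)

# Register a tracer provider with a batch span processor that sends spans to Arize
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)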

Coding in JavaScript instead of Python? See our detailed guide on auto-instrumentation or manual instrumentation, with JavaScript examples.

The following code snippet showcases how to automatically instrument your OpenAI application.

# Import open-telemetry dependencies
from arize_otel import register_otel, Endpoints

# Setup OTEL via our convenience function
register_otel(
    endpoints = Endpoints.ARIZE,
    space_key = "your-space-key", # in app space settings page
    api_key = "your-api-key", # in app space settings page
    model_id = "your-model-id", # name this to whatever you would like
)

# Import the automatic instrumentor from OpenInference
from openinference.instrumentation.openai import OpenAIInstrumentor

# Finish automatic instrumentation
OpenAIInstrumentor().instrument()

To test, let's send a chat request to OpenAI:

import openai

client = openai.OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat model you have access to
    messages=[{"role": "user", "content": "Write a haiku."}],
)
print(response.choices[0].message.content)

Now start asking questions to your LLM app and watch the traces being collected by Arize. For more examples of instrumenting OpenAI applications, check out our openinference-instrumentation-openai examples.

Run your LLM application

Once you've executed a sufficient number of queries (or chats) to your application, you can view the details on the LLM Tracing page.
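
A quick way to generate a handful of traces is to loop over a few prompts, reusing the client from the snippet above. The model name here is only an example; use whichever model your account has access to.

prompts = [
    "Write a haiku.",
    "Summarize the plot of Hamlet in one sentence.",
    "Explain what a vector database is.",
]

for prompt in prompts:
    # Each call is captured automatically by the OpenAI instrumentor
    client.chat.completions.create(
        model="gpt-4o-mini",  # example model name
        messages=[{"role": "user", "content": prompt}],
    )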
