OpenAI

Check out an end-to-end example for tracing OpenAI.
See the folder for more OpenAI tutorials.

Arize has first-class support for instrumenting OpenAI calls and viewing both input and output messages. We support role types such as system, user, and assistant, as well as function calling.

We follow a standardized format for how trace data should be structured using OpenInference, our open-source package based on OpenTelemetry. The package used here is arize-otel, a lightweight convenience package that sets up OpenTelemetry and sends traces to Arize.

Use the code block below to get started with our OpenAIInstrumentor.

# Import open-telemetry dependencies
from arize.otel import register

# Setup OTel via our convenience function
tracer_provider = register(
    space_id="your-space-id",  # in app space settings page
    api_key="your-api-key",  # in app space settings page
    project_name="your-project-name",  # name this whatever you would like
)

# Import the automatic instrumentor from OpenInference
from openinference.instrumentation.openai import OpenAIInstrumentor

# Finish automatic instrumentation
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

Now start asking questions to your LLM app and watch the traces being collected by Arize. For more examples of instrumenting OpenAI applications, check our openinference-instrumentation-openai examples.
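
For example, a plain chat completion like the one below will be captured automatically once the instrumentor is active. This is a minimal sketch: it assumes the OpenAI Python SDK (v1+), that OPENAI_API_KEY is set in your environment, and the model name is an arbitrary choice.

# Minimal sketch of an instrumented call; assumes the register/instrument
# code above has already run and OPENAI_API_KEY is set in your environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: any chat model your app uses works
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write a haiku about tracing."},
    ],
)
print(response.choices[0].message.content)

No extra code is needed to record the span: because OpenAIInstrumentor patches the OpenAI client, the system and user messages, the assistant's reply, and the model parameters are all captured and sent to Arize automatically.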

Tracing Function Calls

Arize simplifies debugging by automatically tracing function calls made by LLMs. With the code above, Arize receives every function call and lets you review the function outputs, arguments, and any parameters used, as in the sketch below. These traces can be loaded into the prompt playground, where you can experiment with parameters, adjust function definitions, and optimize workflows.
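
To illustrate, here is a minimal sketch of a tools-based request that would produce a function-call trace, assuming the instrumentation above is already active. The get_weather tool, its schema, and the model name are illustrative assumptions, not part of Arize's API.

from openai import OpenAI

client = OpenAI()

# Hypothetical tool definition for illustration; swap in your own functions.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: any tool-capable chat model works
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# The instrumentor records the tool call (name and arguments) on the span.
print(response.choices[0].message.tool_calls)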
