Overview: Tracing
Tracing the execution of LLM applications
Tracing is a powerful tool for understanding how your LLM application works. It helps you track down issues like application latency, runtime exceptions, incorrect prompts, poor retrieval, and more.
Arize is built on top of the open source packages OpenTelemetry and OpenInference. We have native support for the OpenTelemetry protocol, which is vendor-agnostic, open source, and highly performant, and includes batch processing that can handle billions of traces and spans. OpenInference standardizes trace and span data across models, frameworks, tool calls, prompts, retrievers, and more.
It takes only a few lines of code to set up tracing with Arize using our auto-instrumentation, and it's flexible enough to let you add your own metadata and customize your spans; see the sketch below.
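For example, here is a minimal sketch of auto-instrumenting an OpenAI-based application. It assumes the `arize-otel` and `openinference-instrumentation-openai` packages are installed; the exact `register()` parameters and placeholder values shown here may differ for your setup.

```python
# Minimal auto-instrumentation sketch (assumes arize-otel and
# openinference-instrumentation-openai are installed).
from arize.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

# Point an OpenTelemetry tracer provider at your Arize space and project.
# The credential values below are hypothetical placeholders.
tracer_provider = register(
    space_id="YOUR_SPACE_ID",
    api_key="YOUR_API_KEY",
    project_name="my-llm-app",
)

# Auto-instrument OpenAI client calls so each LLM request emits a span.
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```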
To get started with code, check out the guide for LLM tracing and evaluation.
Learn more by reading our articles on tracing concepts.
Deciding between automatic and manual instrumentation? Read our guide on how to choose between them.
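As a rough illustration of the manual side, here is a sketch using plain OpenTelemetry to wrap a step of your application in its own span. It assumes a tracer provider has already been registered (e.g. as in the snippet above); the span name and attribute keys are illustrative, not OpenInference's semantic conventions.

```python
# Manual instrumentation sketch with plain OpenTelemetry
# (assumes a tracer provider is already registered).
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

# Wrap any step you want to see as its own span, and attach your own
# metadata as span attributes.
with tracer.start_as_current_span("retrieve-documents") as span:
    span.set_attribute("retrieval.query", "What is tracing?")
    docs = ["doc-1", "doc-2"]  # placeholder for your retriever call
    span.set_attribute("retrieval.document_count", len(docs))
```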