How to: Tracing (Manual)

Tutorial notebook for tracing OpenAI LLM calls with manual instrumentation, with a specific focus on function calls and the span attributes you can record. The tutorial also shows how spans can be loaded into the Prompt Playground for iteration in the UI.
Learn more about LLM instrumentation

Arize is built on OpenTelemetry (OTEL) as the foundation for LLM tracing. The platform natively collects traces generated via OpenInference automatic instrumentation. Because Arize supports OpenTelemetry, you also have the option to instrument your application manually, with no LLM framework required.

Set up Tracing

How to Send to a Specific Project and Space ID

Get the Current Span/Context and Tracer

Log Prompt Templates & Variables

Add Attributes, Metadata and Tags to Span

Add Events, Exceptions and Status to Spans

Set Session ID and User ID

Configure OTEL Tracer

Log Input

Log Outputs

AI Powered Search & Filter

Export Traces

Tracing an Agent

Send Traces from Phoenix -> Arize

Copyright © 2023 Arize AI, Inc